Shared posts

20 Jan 07:55

DIY Pipe Freezing Kit

by James Hobson
Miguel A. Arévalo

Curious; I’ve never seen this in Spain... but I can think of plenty of reasons to use it in old installations...


Have you ever needed to freeze a water pipe to do a quick plumbing job without shutting off all the water? It’s a fairly common practice for contractors, except they use a rather expensive tool to do it. As it turns out, there’s a cheap and easy DIY alternative you can pull off with minimal supplies or experience.

[Go Repairs] shows us that all you need is some pipe insulation (or a large sponge), a plastic bag, two zip-ties, and an air duster can. Wrap the insulation and plastic bag around the pipe, and zip-tie them in place. Holding the air duster can upside down, release the majority of its contents into the insulation. Congratulations, you’ve just frozen your pipe.

It works by exploiting the properties of a compressed gas — in this case tetrafluoroethane — the “air” in the air duster.

When this gas is under pressure, it exists in the can as both a liquid and a gas. Spray the can normally and you get only the gas; hold the can upside down and the liquid comes out instead. Because tetrafluoroethane cannot remain a liquid at atmospheric pressure, it boils instantly as it leaves the can, and this endothermic phase change causes an extreme temperature drop in the surroundings.
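For a ballpark sense of the cooling power involved, here is a back-of-the-envelope sketch. The latent heat of R-134a and the liquid mass in one can are assumed round figures, not values from the article or the video.

```python
# Back-of-the-envelope estimate of the cooling from boiling off the
# duster's liquid tetrafluoroethane (R-134a) at atmospheric pressure.
# The latent heat and can mass below are assumed ballpark figures.

LATENT_HEAT_KJ_PER_KG = 217.0   # approx. heat of vaporization of R-134a
CAN_CONTENTS_KG = 0.3           # assumed liquid mass in one duster can

SPECIFIC_HEAT_WATER = 4.186     # kJ/(kg*K)
HEAT_OF_FUSION_WATER = 334.0    # kJ/kg released when water freezes

def heat_absorbed(mass_kg, latent=LATENT_HEAT_KJ_PER_KG):
    """Heat drawn from the surroundings as the liquid boils, in kJ."""
    return mass_kg * latent

def water_frozen_kg(heat_kj, start_temp_c=10.0):
    """Mass of water at start_temp_c that this much heat could freeze."""
    kj_per_kg = SPECIFIC_HEAT_WATER * start_temp_c + HEAT_OF_FUSION_WATER
    return heat_kj / kj_per_kg

q = heat_absorbed(CAN_CONTENTS_KG)
print(f"Heat absorbed: {q:.0f} kJ")
print(f"Water that could be frozen from 10 C: {water_frozen_kg(q)*1000:.0f} g")
```

On these assumed numbers, one can absorbs roughly 65 kJ, enough in principle to freeze on the order of 150 to 200 grams of water, ignoring heat leaking back in from the surroundings.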

For a complete explanation, check out the video below.


Filed under: home hacks
20 Jan 07:51

First Firefox OS Running Tablet Leaks Out [Image and Specs] - Gizbot

Miguel A. Arévalo

TVs and now tablets... I’m really starting to like FirefoxOS...


Gizbot

It seems the Firefox OS powered tablet is already on its way to market. Mozilla’s own software developer Asa Dotzler revealed the tablet in a blog post late yesterday. The person who had a great hand in developing the Firefox browser has ...

16 Jan 09:01

The Essence of Antitrust and Regulation

by Don Boudreaux
Miguel A. Arévalo

A lot of naive people have been taken in by net neutrality. It isn’t just that it’s wrong as a principle; in this case it favors the incumbent operators over new entrepreneurs who can enter by improving other services without having to deploy massive networks (Netflix is the classic case). And that’s without getting into the technical debate over how ineffective the proposed options are.

(Don Boudreaux)

Here are the opening lines of a report in today’s New York Times about a recent applause-worthy court decision:

Internet service providers are free to make deals with services like Netflix or Amazon allowing those companies to pay to stream their products to online viewers through a faster, express lane on the web, a federal appeals court ruled on Tuesday.

Federal regulators had tried to prevent those deals, saying they would give large, rich companies an unfair edge in reaching consumers.

Although the court decision and regulation in question are not formally about antitrust, the last sentence quoted above reveals the essence of antitrust – from its 1889 inception in the United States and throughout its history.  That essence is cronyism.  That essence is an effort to suppress competition – an effort that, of course, occurs also through regulatory channels, such as the Federal Communications Commission, other than ones that are formally “antitrust.”

Antitrust began in 1889, with the enactment of a dozen or so state statutes, as an effort to protect economically inefficient but politically influential producers (chiefly, local butchers) from the competition of more economically efficient but politically less popular producers (chiefly, the newly emergent Chicago meatpackers with their newfangled refrigerated railroad cars).  (See also here.)  The 1890 Sherman Act, at the national level, reflected this populist reaction against market-unleashed creative destruction.  The absurdity of antitrust reached its legislative zenith with the 1936 Robinson-Patman Act.

Of course, because it’s good p.r. for government (and for its cronies) to portray competition-suppressing intrusions as being pro-consumer – and because clever minds can always spin ingenious hypotheticals in which voluntary consumer actions today lead to consumer harms tomorrow – antitrust and other competition-suppressing exercises of government power almost always are served up with proclamations about how these or those government prohibitions of capitalist acts among consenting adults will protect consumers from being devoured by various species of nasty devils.

….

Read again the last sentence quoted above from the New York Times report:

Federal regulators had tried to prevent those deals, saying they would give large, rich companies an unfair edge in reaching consumers.

Translation:

Federal regulators had tried to prevent those deals, saying they would give large, rich companies an edge – earned through, and evidenced by, these companies’ past successes – in pleasing consumers.

If you worry that content providers such as Netflix and Amazon, now that they are “large” and “rich,” possess some “unfair” ability to rig markets in their favor at the expense of consumers, ask why internet service providers would go along with efforts by these companies to profit at the expense of consumers.  Internet service providers have every incentive to keep their products as attractive as possible to consumers.  Even if individual consumers have neither the incentive nor the ability to act today in ways that prevent “large, rich” companies from screwing them tomorrow, internet service providers certainly have such incentives and abilities.  (Andy Kleit and I wrote on this matter, in the context of predatory pricing, years ago.  See also here.)  But internet service providers also have incentives and abilities to craft deals with “large, rich” content providers such as Netflix and Amazon that improve consumer welfare even if such deals make life more difficult for less able and less successful content providers.

Government involvement in this matter will almost certainly rig the market in anti-competitive ways for political cronies.  The history of regulation in general, and of antitrust in particular, gives no reason to think differently.

I would be remiss if I failed to mention here the pioneering work on antitrust by Dom Armentano.

16 Jan 07:21

Google Play Movies & TV no longer limited to Android, makes it onto iOS

by Nate Swanner
Miguel A. Arévalo

It’s a bit astonishing that whoever wrote this article doesn’t know, or doesn’t mention, things like the fact that you can’t buy content in the app because of Apple’s restrictions, and that this is the main reason the app was delayed... a restriction Apple doesn’t face on Android, yet it refuses to release iTunes for the number-one platform.

Google Play Music and Movies have come to iOS. Rather than staying Android-only, Play Music and Movies have made their way to the App Store. This is another of many steps Google is taking to become agnostic about OS and more concentrated on services.

Play Movies & TV iOS


While Play Music and Movies pale in comparison to media content from others, specifically Apple’s own ecosystem, there is a compelling reason to make them an option for iOS users. With the new app comes Chromecast support. Like only a handful of third-party apps on iOS, Play Music and Movies can stream to a Google product.

Google has long maintained they are concerned more with services than operating systems, and this move — maybe more than any other — highlights that. The app itself is pretty much like the Android version, so those who use both operating systems will see little difference between the two. Of course, as mentioned, you can Chromecast with this one.

Aside from Netflix and HBO GO, we’re unaware of any other Chromecast-enabled app on iOS. This could be the first media rental/purchase service to allow Chromecasting from iOS, groundbreaking in its own right. The app is available for download now, bringing the mobile world that much closer together.
VIA: Google

15 Jan 13:04

Testing Out: How to “Moneyball” Your Way to a Debt-Free College Degree

by A Manly Guest Contributor
Miguel A. Arévalo

Despite the havoc Sallie Mae is wreaking on the American education market... the market finds ways to offer competitive prices... to those who look for them...

Victoria University students in their rooms

Editor’s Note: This is a guest post from Jay Cross.

I don’t know about you, but I’m fed up with the ancient “college savings tips” so-called experts keep force-feeding us:

  • “Fill out the FAFSA before senior year to maximize aid eligibility!”
  • “Buy used textbooks, you’ll save hundreds!”
  • “Apply for scholarships. Try FastWeb.com!”

If you follow this advice, you will be thoroughly and totally prepared for college…in 1995. (You know, just in case that year ever comes back.) But in 2013, these strategies will get you slaughtered by the “Student Loan Industrial Complex.”

The Stark Truth About Yesterday’s College Tips

  • Many students don’t fill out the FAFSA at all, never mind early. I didn’t. It’s long, tedious, and confusing. Even if you do, the skyrocketing cost of college makes student loans a “lose/lose” proposition. Either you don’t qualify (in which case you can’t get aid) or you do qualify (in which case you probably shouldn’t take it). Today’s loans are causing some graduates to commit ACTUAL suicide after committing financial suicide.
  • Textbooks, while 812% pricier than 30 years ago, are a tiny portion of your total college costs. Every penny helps…but buying used books doesn’t pay for college any more than clipping coupons pays for a mansion.
  • Most people who apply for scholarships get nothing. It’s a near-universal waste of time.

Continuing to preach this outdated advice is worse than useless: it’s condescending. It trivializes how challenging financing your education has become. Today’s students don’t need the marginal cost reducers of the past. They need to completely rethink their strategy and beat colleges at their own game.

In this post, I am going to share an uncommon approach that can help you graduate faster, avoid student debt, and jump into the job market with both feet beneath you.

Testing Out: The Best Kept Secret in Higher Education

Let’s say you’re just starting college and you need to earn all of those major-independent “general education” credits (e.g., English 101) that every student takes.

What do you do?

Option #1: Take a standard English course. You know what that means: classes, homework, tests, quizzes, projects, group assignments…ugh. 95% of students assume this is their only option.

Time: 4 months

Cost: $3,000+

Option #2: Take CLEP tests instead. These multiple-choice exams cover a full semester of material…and if you pass, you get the same credits you would have spent months in a classroom for.

Time: 2-3 hours

Cost: $80

CLEP offers 33 exams in five subject areas, is accepted for credit by 2,900 colleges and universities, and is proctored in over 1,800 test centers nationwide. Developed by College Board — the same organization behind AP courses and the SAT — CLEP measures your knowledge regardless of how you obtained it: independent study, internships, and work experience included.

CLEP is the most popular exam option, but not the only one. Other formats include DSST, Excelsior/UExcel, and Thomas Edison State College (TECEP) exams. Every school decides for itself which tests to accept and how much of your degree you can test out of. At some schools, it’s 30 credits. At others, it might be 45. Still others allow 60 or 90 or even the full 120 credits of a bachelor’s degree.

Regardless of the exact number of exam credits your school lets you earn, the savings potential is massive. And you don’t need to pray for scholarships or nebulous “aid” to get it. Testing out is completely within your locus of control: you just need to plan, prioritize and study like a gentleman scholar.

Even if your school only allows 30 exam credits, that’s still a quarter of your degree. You will save nearly $30,000 and graduate faster by not earning those credits in the classroom. I completed the final 30 credits of my degree via test-out and was the only person I know to graduate 100% debt-free. (Here’s my story, if you’re interested.)

Some people have heard of testing out, but most haven’t. Schools don’t shout from the rooftops about their test-out allowances. Why would they? Higher education is a business: the longer they can “keep you in the store” with costly courses, the more money they make. But this information is buried in your academic handbook under “alternative credit policies” or “transfer credit allowances.” Look for it.

Sadly, even students who have heard of testing out tend to use it for just one or two courses, not the majority of their degrees.

I believe students owe it to themselves to figure out the fastest and most affordable graduation path at their disposal, even if they don’t test out of absolutely everything.

Case Study: Earning a $139K degree for $47K at Sacred Heart University

Enough theory. Here’s a real analysis I created to illustrate how powerful your test-out savings can be. In this example, we’ll assume you want a bachelor’s degree in psychology from Sacred Heart University, a small, expensive, highly-ranked private university in Connecticut.

Here is the research necessary to calculate how much time and money you can save by testing out:

  1. Determine the “full sticker price” tuition of a four-year degree at SHU (roughly $139,000)
  2. Determine which exam formats SHU offers (CLEP, DSST, Excelsior/UExcel)
  3. Determine how many exam credits SHU allows students to earn (90)
  4. Determine the course requirements of a psych degree at SHU (degree chart)
  5. Determine how many of those required courses have CLEP, DSST, or Excelsior/UExcel exams for you to take instead (25)
  6. Determine the costs of each of those exams ($80/each for CLEP and DSST, $100/each for Excelsior/UExcel)
  7. Assume you can study for and pass a new test every 2-3 weeks

I’m building an app to crunch this data automatically, but it can all be done by hand. If you wanted to test out of as much of a psychology degree as Sacred Heart will allow, here are your potential savings:

[Table: potential savings from testing out of a psychology degree at Sacred Heart University]
Every school should post an analysis like this on their website. Since they never will, you need to perform the analysis yourself.
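For anyone who wants to run the numbers before such an app exists, here is a minimal sketch of the by-hand analysis using the SHU figures from the numbered list above. The flat per-credit tuition and the 20/5 split of the 25 exams between fee tiers are my simplifying assumptions, not SHU’s actual schedule.

```python
# Sketch of the test-out savings analysis for the SHU psychology degree,
# using the figures from the numbered list above. The flat per-credit
# tuition and the 20/5 split of the 25 exams are simplifying assumptions.

STICKER_PRICE = 139_000      # step 1: four-year tuition at SHU
TOTAL_CREDITS = 120          # credits in a standard bachelor's degree
CREDITS_PER_EXAM = 3         # typical credit award per passed exam

clep_dsst_exams = 20         # assumed split of the 25 available exams
excelsior_exams = 5
CLEP_DSST_FEE = 80           # step 6: dollars per exam
EXCELSIOR_FEE = 100

cost_per_credit = STICKER_PRICE / TOTAL_CREDITS

exam_credits = (clep_dsst_exams + excelsior_exams) * CREDITS_PER_EXAM
classroom_cost_avoided = exam_credits * cost_per_credit
exam_fees = clep_dsst_exams * CLEP_DSST_FEE + excelsior_exams * EXCELSIOR_FEE
savings = classroom_cost_avoided - exam_fees

print(f"Credits earned by exam: {exam_credits}")
print(f"Classroom cost avoided: ${classroom_cost_avoided:,.0f}")
print(f"Exam fees paid:         ${exam_fees:,}")
print(f"Estimated savings:      ${savings:,.0f}")
```

The 75 exam credits computed here fit comfortably under the 90-credit cap SHU allows, and the savings stay five-figure even if you shift the exam split or the per-credit price around.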

Again: nothing says that you HAVE to test out of every subject a school will allow. There are valid reasons for taking a course in person: class discussions, mentorship from a professor, or simply not feeling comfortable studying a tough subject on your own (I’m looking at you, fellow non-math people!).

Rather, the point of running an analysis like this is to figure out your options. With college being as costly as it is today, there is no excuse for not giving yourself the opportunity to say “yes” or “no.”

The “Moneyball” Approach to Cutting College Costs

Here’s a useful analogy for how testing out fits into the overall picture of your college career.

Think about Billy Beane, GM of Major League Baseball’s Oakland Athletics (and the protagonist of Moneyball). Back in 2002, Beane knew the A’s could only spend a fraction of what richer teams like the Yankees had on hand.

So instead of trying to copy those clubs, Beane figured out a brand new way to build a baseball team.

Oakland used cutting-edge statistical analysis to find “misfits” — players who were overweight, or threw the ball side-armed, or couldn’t run quickly — and constructed its entire roster with them. The other teams laughed, but here’s the best part: the A’s wanted their bargain-priced “misfits” even more than the stars they couldn’t afford.

These oddball players had undervalued talents that led to scoring runs and winning games. Most importantly, they were cheap.

The Yankees — and all the teams attempting to emulate them — were asking, “How can we buy the biggest stars?”

The A’s asked a totally different question: “How can we buy WINS (and runs, the building blocks of wins) for pennies on the dollar?” Beane didn’t care who his individual players were as long as the team was winning.

[Chart: 2002 payroll cost per win, Yankees vs. Athletics]

2002 was the perfect season to compare these approaches because the Yankees and A’s each won 103 games. The breathtaking difference? New York spent $1.5 million per win. Oakland, using their unique approach, spent just $250,000.

Here is a cost-per-credit analysis I created using average tuition figures at public and private four-year universities, as well as the cost of earning a “test-out” degree:

[Chart: average cost per credit at public and private four-year universities vs. a test-out degree]

Sources: CollegeBoard and DIY Degree

Testing out is a “Moneyball” approach to higher education. “Cost per credit” is to college what “cost per win” is to baseball: the number you want to intelligently optimize. In college as in baseball, the way to win an unfair game is by taking a new approach to get the same results more efficiently.
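Using the prices quoted earlier in the post ($3,000+ for a semester course, $80 for a CLEP exam, each worth three credits), the cost-per-credit gap is easy to compute:

```python
# "Cost per credit" comparison using figures quoted earlier in the post:
# a $3,000 classroom course vs. an $80 CLEP exam, each worth 3 credits.

def cost_per_credit(total_cost, credits=3):
    return total_cost / credits

classroom = cost_per_credit(3000)   # $1,000 per credit
clep = cost_per_credit(80)          # about $27 per credit

print(f"Classroom: ${classroom:,.2f} per credit")
print(f"CLEP exam: ${clep:,.2f} per credit")
print(f"The exam route costs {clep / classroom:.1%} of the classroom price")
```

That is the "Moneyball" number in miniature: the same three credits for under three percent of the classroom price.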

Why You Should Look at College Like an Investment

If this post makes it sound like you’re “Frankensteining” your education, cobbling various exams and credit sources together to form a degree…you’re right. That’s exactly what I’m advocating.

This might seem strange at first, but I encourage you to look at it differently.

Why do we see college as this magical guarantee of financial success? It’s because of these oft-cited studies on how much more graduates earn over their lifetimes than non-grads. We hear sweeping statements (“people with bachelor’s degrees earn $1 million more!”) and assume that it MUST be a great investment, no matter what it costs.

Actually, we don’t just assume it — we’re explicitly told that it’s true:

“Over a lifetime, the gap in earning potential between a high-school diploma and a bachelor of arts is more than $800,000. In other words, whatever sacrifices you and your child make for [a] college education in the short term are more than repaid in the long term.”

That’s from the College Board, the organization that makes the SAT. They’re basically telling you to just pay whatever a degree costs.

It’s horrible advice.

You don’t make huge financial decisions with simplistic rules like “whatever sacrifices you make are worth it in the long term.” How is that any different from telling you to shoot first and ask questions later? No: you make huge financial decisions by running the numbers.

Which brings us back to these studies on college graduate earnings. They aren’t “wrong,” but they are misleading.

Here’s why: earning a higher income doesn’t automatically mean you’re getting ahead. You can earn $20,000/year before college, get a $60,000/year job afterwards, and still be no better off. If you spend $100,000 for a degree (and take four years off of work to do it) you have incurred a huge financial and opportunity cost.

You took out a loan against your future earnings which must now be repaid over five, ten, maybe even fifteen or twenty years. Even then, once all the loans are repaid and you’ve earned back all the income you lost by not working, guess what? All you have done is break even!

You’re back at square one. Finally, after years of repaying loans and interest, you can start actually benefiting from the higher income you earned your degree for. Most college students don’t realize that this is what they’ve agreed to until after they graduate. They just see college as a magical guarantee of financial success. Yet whether they realize it or not, their student loans often chain them to a life of indentured servitude.
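The break-even arithmetic behind the example above (a $20,000/year earner paying $100,000 and four years of forgone work for a $60,000/year job) can be sketched like this, ignoring loan interest, taxes, and raises to keep the point clear:

```python
# Break-even sketch for the example above: trade a $20,000/year job and
# $100,000 in tuition for a $60,000/year job after four years of school.
# Loan interest, taxes, and raises are ignored to keep the point clear.

PRE_INCOME = 20_000
POST_INCOME = 60_000
DEGREE_COST = 100_000
YEARS_IN_SCHOOL = 4

forgone_income = PRE_INCOME * YEARS_IN_SCHOOL   # opportunity cost of not working
total_cost = DEGREE_COST + forgone_income       # total hole to climb out of
annual_gain = POST_INCOME - PRE_INCOME          # extra earnings per year

years_to_break_even = total_cost / annual_gain

print(f"Total cost of the degree: ${total_cost:,}")
print(f"Years to break even after graduation: {years_to_break_even:.1f}")
```

On these numbers the graduate starts $180,000 behind and needs four and a half working years just to get back to square one, which is exactly the trap described above.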

Other things being equal, the return on an investment shrinks as the time and money you put in grow. In plain English: the longer it takes you to graduate, and the more you pay, the less valuable your degree ultimately is.

Testing Out Actually Delivers What Colleges Falsely Promise

Fortunately, the reverse is also true. The less time it takes you to graduate, and the less you pay for your degree, the more valuable it is.

By testing out, you are doing what so many students never do. You’re being strategic. You’re treating your degree as an investment, rather than a collegiate shopping spree. You’re being efficient by extracting the most value for the least cost.

Consequently, you actually will reap the rewards of higher postgraduate income. Since you didn’t take years off or incur costly loans, all of that extra money goes straight into your pocket.

If you are struggling to figure out how to afford college, I hope the test-out strategy is a breath of fresh air and a new lease on academic life. Please leave any questions in the comments. I’m happy to help anyone who needs it!

____________________

Jay Cross runs DIY Degree and helps students learn more, spend less, and graduate faster. His college acceleration strategies have been covered by Fox Business, Huffington Post, Popular Mechanics, Brazen Careerist, The Personal MBA, and I Will Teach You To Be Rich.

14 Jan 13:30

'The corn pigs eat is higher quality than what babies eat'

by NOA DE LA TORRE
Miguel A. Arévalo

They’re going to crucify him, but what he says is basic biology.

If a tomato contains more technology than an iPhone 5, how could “natural” food not be a scam? José Miguel Mulet, declared enemy of Greenpeace and defender of genetically modified foods, returns to the literary scene with a new book: Comer sin miedo (Destino). “They say we are what we eat, but I don’t consider myself a mussel,” he fires off.

Mulet (Dénia, 1973) is a chemist and a professor of biotechnology at the Universitat Politècnica de València. Hence his aim with this book, which goes on sale January 16, is to dismantle myths that lack any scientific basis, such as the idea that it is possible to eat absolutely natural food.

“Food is a cultural expression that changes as we change and, as a cultural expression, it is artificial.” One example: the fruit and vegetable varieties we eat today are no more than 20 or 30 years old. And what about the carrot? Originally it wasn’t orange but a color closer to that of the parsnip. The Dutch changed the history of this vegetable when they decided to present it to their royal family.

Safer and with more variety

“Food is now safer and there is more variety, even though people tend to idealize those childhood moments eating at grandma’s house.” Was the food of the past healthier? No, according to Mulet. “We want the latest technology for everything, but we want our food old-fashioned,” the Valencian scientist notes. Just look at supermarket shelves, crowded with labels reading rustic, artisanal... and organic.

“Organic food is not healthier,” Mulet maintains, sharply critical of the “misleading propaganda.” “Not even the European Union says so,” he argues. What’s more, the regulations governing so-called organic products don’t even consider their environmental impact. “The only requirement is that everything you apply to the crop be natural, which is why copper is allowed as a fungicide even though it is highly toxic.” In Mulet’s view, the European rules “answer to ideological motives.”

This is where we run into GMOs. His plain-language definition: “organisms that carry a piece of DNA from another.” In Europe it is legal to plant Bt corn (Spain, incidentally, is Europe’s largest producer of GM corn), which “carries a bacterial gene that kills pests.”

So, as Mulet stresses, growing this GM corn avoids insecticides, is cheaper, and emits less carbon dioxide... Yet social rejection of GMOs means that in Europe this Bt corn goes exclusively into animal feed. The result: “The corn pigs eat is higher quality than what we give babies.”

The cucumber crisis

What’s more, Mulet even questions the level of food safety associated with organic food, since, for example, the ban on pesticides favors pests, just as small-scale distribution makes these products harder to monitor.

The cucumber crisis, to take just one example, originated with salad sprouts distributed through organic food chains, the scientist recalls. “Since then there have been four or five food alerts linked to organic food.”

And speaking of safety, what about the fashion for eating food nearly raw? Beware: the prized steak tartare “is not a safe food from a microbiological point of view.” Mulet insists that “cooking is the most basic hygiene measure.” So anything involving ground meat (read: hamburgers...) had better be well cooked.

What does the future hold? Synthetic meat, for sure. “A grazing cow pollutes a lot.” Put another way, organic livestock farming pollutes more than a barn, where the methane the cow emits can be filtered, for example.

Carrot
Originally it wasn’t orange; its color was closer to the white-to-yellow of the parsnip. The Dutch made it orange in honor of their royal house.
Tomato
The original is small, comes from the Andes, and is toxic. “The Canary Islands tomato has been genetically improved through years of research, like any cultivated species, which is artificial.”
Eggplant
It used to have thorns but, after artificial selection, it is now prettier yet has fewer antioxidants.
14 Jan 13:19

Santiago Abascal and Ortega Lara’s political party is born

Miguel A. Arévalo

It looks like these European elections are going to be fertile ground for third parties...

Ortega Lara and Santiago Abascal during the demonstration called by DENAES in support of the victims of terrorism | LD

Luis F. Quintero 2014-01-14

On December 6, Madrid’s Plaza de la República Dominicana hosted the demonstration called by the DENAES foundation in support of the victims of terrorism, after the European Court of Human Rights in Strasbourg (ECHR) struck down the Parot doctrine in the case of Inés del Río. The Spanish courts, following an advance announcement from the Interior Ministry, then rushed to release ETA prisoners by the dozen, before the stunned eyes of the victims of terrorism.

At that demonstration, which brought together the president of the AVT, Ángeles Pedraza, with Daniel Portero, president of Dignidad y Justicia, the president of Covite, Consuelo Ordóñez, and the president of Voces contra el Terrorismo (VcT), Francisco José Alcaraz, José Antonio Ortega Lara’s speech was among the most applauded. Santiago Abascal’s also received a warm reception, with attendees calling on him again and again to step forward and form a party. As of this Tuesday, we can say that Ortega Lara and Santiago Abascal’s party is a reality.

At Libertad Digital we reported on January 5 that this new party would be formed in “a matter of days.” According to a statement the group released this Tuesday, the official presentation of a “new nationwide political party” will take place this Thursday, January 16. Its founders are:

  • Ignacio Camuñas
  • José Luis González Quirós
  • Santiago Abascal
  • José Antonio Ortega Lara
  • Ana Velasco Vidal-Abarca
  • María Jesús Prieto
  • Cristina Seguí
  • Iván Espinosa de los Monteros
  • Enrique Álvarez

Its platform, name, and corporate image will not be revealed until this coming Thursday.

14 Jan 13:06

A gym for the brain

by MARTA BELVER

Tired of administrative straitjackets and of companies that only seek to profit from other people’s health, they decided to create their Memory and Brain Training Unit. Here they don’t only treat patients with degenerative diseases; they also, like someone working to keep their body in shape, exercise the minds of those who no longer have a dynamic intellectual life.

This peculiar gym for the head, located in Alcorcón, is run by Fabio Cabello, an occupational therapist specializing in geriatrics and gerontology, and Raquel Gutiérrez, a psychologist expert in neuropsychology. Combining their years of work experience in both public and private residences and care centers, they have shaped “a care resource that didn’t exist” and “at an appropriate price.”

“We detected a need for a more specific kind of treatment for people with mild cognitive impairment or at the onset of some pathology,” Cabello explains. “The important thing is to do an assessment, analyze the cognitive components affected and work on them, such as deficits in attention or in the ability to remember,” the heads of the Memory Unit add.

They also offer help to retirees making the transition from workplace stress to inactivity without trauma, and to workers who freeze up mentally when facing a promotion. “Emotional problems, such as depressive states, are only going to get worse, on top of the issue of substance use; that’s where the growth in demand will come from,” these experts predict.

Cabello warns that preventing memory decline “is very important, but people aren’t aware of it. It’s like doing physical exercise to stay in shape, but for now practically nobody sees it as necessary.”

To try to reverse this trend, the Memory Unit’s Facebook profile posts daily brain-training exercises. For example: determining which number appears most often in a sequence of digits; finding the cat in a photo of a zoo packed with animals; thinking of words ending in -tre... The page’s online counter is eloquent: every week 90,000 people metaphorically put on their tracksuits to take on the challenge of activating their minds.

Even so, the most innovative part of these two professionals’ work is their family therapy sessions (the next one will be February 1). “In them we can act directly on the caregivers and help them understand the disease better, not in theory but with therapeutic resources and social skills,” they argue.

Fabio Cabello and Raquel Gutiérrez, who have been working since September without bosses clipping their wings, have already received proposals to open franchises of the Memory and Brain Training Unit. They encourage those who have suggested it to strike out on their own as well, but administrative matters don’t interest them. They prefer to invest their time in training minds.

13 Jan 09:05

[x]

Miguel A. Arévalo

Or a slow computer...



[x]

08 Jan 11:14

Fed Admits: We Don’t Know How QE Works

by Mark Thornton
Miguel A. Arévalo

Heh, not surprising: most politicians launch policies without really knowing what results they will produce.

Amazing! According to a Reuters report, President of the Federal Reserve Bank of New York William Dudley admitted that the Fed does not know how its Quantitative Easing policies work to help the economy and that it doesn’t have a well-developed model of the economy that incorporates a “realistic” financial sector.

Yet “we still don’t have well-developed macro-models that incorporate a realistic financial sector,” William Dudley, president of the New York Fed, told an economics conference.

“We don’t understand fully how large-scale asset purchase programs work to ease financial market conditions, there’s still a lot of debate …” he said. “Is it the effect of the purchases on the portfolios of private investors, or alternatively is the major channel one of signaling?”

03 Jan 09:46

Science Has Almost Caught Up with Science Fiction

by David Friedman
Miguel A. Arévalo

Interesting, I didn't know about polar bodies. An ethical (and practical) doubt would definitely remain, but of a much lower order than murder.

Beyond This Horizon, an early Heinlein novel, describes a technology for what I like to describe as libertarian eugenics, a way in which a couple can choose, among the children they could have, which one they do have. The central trick is a way of separately analyzing the genes of sperm and egg before combining them.

The obvious problem is how to analyze a cell without damaging it. Heinlein's ingenious solution takes advantage of the fact that egg and sperm each contain half of the full set of genes of the cell from which it is derived—and the process that produces an egg or sperm also produces other bodies with the rest of the genes. Analyse an ordinary cell, analyse the extra body, subtract, and you now know what genes are in the egg or sperm.
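Heinlein's subtraction trick is essentially a multiset difference. A toy Python sketch of the idea, with the allele labels entirely invented for illustration:

```python
from collections import Counter

def infer_gamete(cell_alleles, extra_body_alleles):
    """Deduce the gamete's alleles: everything in the parent cell
    that did not end up in the discarded extra body."""
    return list((Counter(cell_alleles) - Counter(extra_body_alleles)).elements())

# Invented alleles: the parent cell carries both variants of each gene;
# the polar body received one of each pair, so the egg must hold the others.
cell = ["A", "a", "B", "b"]
polar_body = ["a", "B"]
print(infer_gamete(cell, polar_body))  # prints ['A', 'b']
```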

According to a news story published yesterday, someone is finally attempting a version of Heinlein's idea, looking at the polar body that contains the genes that are not in the egg in order to deduce the genes that are. At least, that is what the story seems to be describing, although there are not enough details to be certain.

Heinlein published the novel, and the idea, in 1942.
03 Jan 09:20

Evaluating Controversial Claims

by David Friedman
Miguel A. Arévalo

What's missing is, for me, the best point of view... intuition; that is, subjective evaluation based on anecdotal data and, if that data comes from someone, seeing in person how they express their opinions. The brain has spent many millennia evolving to make decisions that are essentially good for its survival (even if not optimal) with little data and little time, so I think it's worth giving it a chance now and then :-D

Someone makes a scientific claim, typically as an argument for some policy. Examples would be current claims with regard to global warming, claims fifty years ago about the consequences of population growth, claims early in the Obama administration about the need for large deficits to bring down unemployment. There are at least four different ways in which an interested observer can decide whether or not to believe the claim.

1. Partisanship. If you support the policy, believe the claim. If you don't, don't. This is probably the most common approach.

2. Evaluate the arguments for yourself. This is  the most entertaining and educational approach but no more reliable than the first—and likely to give the same answer. There is always controversy about the claim among people better equipped to evaluate it than the random observer, although one side or both may try to deny it. In the case of global warming, the relevant claim is not merely that temperatures are going up, or that the reason is human activity, or that they can be expected to go up by enough to cause serious net costs, but all of those plus the additional claim that there are ways of reducing the increase that are worth their cost. To evaluate all of that you need a reasonably expert knowledge of climatology, statistics, ecology, economics, and probably two or three other fields I have not thought of. Since you don't have all that, you end up believing whichever arguments you want to believe.

3. The argument from authority. You try to figure out what the consensus of the people who are experts is or what some authoritative source of information says. An outsider trying to figure out what professionals in a field believe is at risk of overvaluing whatever position has the most support from public sources of information, such as the mass media, or has done the best job of getting its supporters onto the committees of scientific organizations that put out public statements. And even if he could figure it out for one field, that isn't sufficient. Again taking the global warming case, it is not enough to know what the consensus of the climatologists is, even if you can separate the facts from the puffery on that subject. Climatologists are not economists, so they could be correct about the expected temperature increase and wrong about the magnitude or even the sign of its consequences. Economists are not ecologists, so they might show the costs they are looking at to be insignificant while missing the effects of climate change on other species. I discussed problems with this approach at greater length in an earlier post.

4. Prediction. Once such a controversy has been going for a while, partisans have a track record. If they have made confident predictions that turned out to be wrong, that is good evidence that they are either dishonest or arguing from an incorrect theory. Figuring out whether the arguments for a theory are right or wrong is much harder than finding out what that theory predicted. Sometimes all the latter takes is  a book or article by its supporters written a few years back.

The clearest case is the population hysteria of the 1960's. Paul Ehrlich's Population Bomb, published in 1968, confidently predicted mass famine in the third world over the next decade, with hundreds of millions of people starving to death. Not only did it not happen, the real world moved in the opposite direction, with calorie consumption per capita in the third world going up, not down. That is very strong evidence that Ehrlich can not be trusted. It is somewhat weaker evidence that the movement of which he was part, whose members generally took him and his arguments seriously, can not be trusted.

Weaker examples apply to my other two cases. Early in Obama's first term, the Administration offered predictions of what the unemployment rate would be without a stimulus and how much lower it would be with the stimulus that the Administration wanted and got. Actual unemployment rates for the next several years were higher with the stimulus than predicted without it. That does not tell us whether the stimulus was a good policy. It is possible, as its supporters argued after the fact, that things were simply worse than they thought. But it is good evidence that predictions made by Administration economists can not be trusted, that either they were deliberately fudging the results or were using models much less reliable than they claimed.

For the case of global warming, we have the IPCC's repeated overpredictions of global temperatures, hurricane rates that are strikingly lower, not strikingly higher, this year than the average, and a number of other predictions to which the real world has failed to conform. Again, that does not show that the underlying argument is wrong. It does show that the people and models that have been generating the mistaken predictions cannot be trusted.

Which is about the most that the interested outside observer can hope to learn.

---

P.S. Not surprisingly, a lot of the comments on this post focus on the particular case of global warming rather than the general argument. One commenter provides a link to an article by Chris Landsea on the effect of global warming on the frequency and strength of hurricanes. One conclusion is that global warming by the end of the century might result in a slight increase in the strength of hurricanes and might also result in a substantial decrease in their frequency. 

Another is that the historical record provides no support for the claim that hurricanes have been becoming more frequent or stronger over the past century or so. The reason that hurricane damage has trended up is that the total value of property in coastal counties has increased. The reason that the number of recorded hurricanes has trended up is probably the large improvement in our ability to detect hurricanes, in particular ones that fail to make land. For details see the article.
17 Dec 21:22

BCV implementa “Inflación Fuerte” y corre dos ceros los números de inflación

by Chucho
Miguel A. Arévalo

Heh, in Europe we could invent Strong Inflation :-P


The Central Bank of Venezuela (BCV) announced today the implementation of “Strong Inflation,” a system that would shift the country's official inflation figures by two zeros. That would turn the accumulated inflation of 45% into barely 0.45%.

“If there is a Strong Bolívar, there has to be a strong inflation, so now instead of 50% inflation there will be 0.5%. WE ARE A POWERHOUSE AND WE ARE SHIELDED!” exclaimed Eudomar Tovar, while gesturing at the journalists so they wouldn't see how gold was being taken out of the reserves to be sold in downtown Caracas. “I think we're done for 2013: we managed to lower prices by decree, now we've shifted the decimal point of inflation, and in 2014 we're simply going to stop announcing economic figures, because inflation is over for good, and if you think that makes no sense, then leave the country. You'll see: the envious will say it's a lie, but next year inflation won't be 1000% but 10%, like the rest of Latin America.”

We tried to contact some economists for their opinion on these measures, but we received a visit from a representative of the prosecutor's office, who forbade us from publishing any information about basic economics. “Economics is subjective; it is not a science like everyone believes. That is why you are strictly forbidden from publishing the kind of economic information the bourgeoisie calls ‘logic.’ From now on, the economics sections of the media will run Gaturro, Pepita, and Olafo comic strips, so start drawing now!” declared the prosecutor's representative while sticking smiley faces on all the zeros of the prices.


13 Dec 15:15

New Dad Survival Guide: The Mindset

by Brett
Miguel A. Arévalo

A child demolishes the old structure of your life — the one that revolved around you — and it takes some time to build a new structure in its place – one that revolves around your child. Until the new structure is complete, life can feel a little shaky.


Yesterday we talked about the “skillset” needed by new dads. Things like changing a diaper and burping a baby will come pretty easily with practice. Keeping a positive mindset is harder, and much more important. Here are my suggestions for maintaining a healthy perspective and preserving your sanity after a baby bomb drops from the sky and blows up your old life.

New Dad Survival Guide: The Mindset

Realize you were made to do this. There’s a popular perception that being an involved dad is an entirely modern phenomenon — and that primitive men just deposited their seed and went on their way. But recent research on the neurological and physiological changes men experience before their baby arrives shows that male parental involvement is in fact deeply ingrained.

Three weeks before your baby is born, your testosterone levels will fall by about a third in preparation for you taking on a more nurturing role. (They’ll rise back to pre-birth levels about six weeks after your baby arrives.) At the same time, your brain will actually change too – becoming more adept at homing in on a baby’s cries. As the researcher who discovered dads’ drop in T put it: “Raising human offspring is such an effort that it is cooperative by necessity, and our study shows that human fathers are biologically wired to help with the job.”

Contrary to the popular cultural stereotype of the bumbling dad who has to constantly be rescued by Mom, men can be just as competent at childcare as women.

Realize there’s no “normal” way to feel. People love the image of the man — especially one who’s typically tough and stoic – who cries upon first seeing his progeny. And guys will often talk about the overwhelming love they felt for their newborn as soon as it emerged from the womb. And if that happens to you, great. But realize there’s a whole host of different emotional reactions you might experience. I didn’t cry when either of my kids was born. I was just like, “Whoa, cool, there’s my baby!” I did feel a surge of love and pride when I first held them, but you might not even feel that. Instead your baby might seem like a weird stranger — a tiny alien that suddenly beamed down into your life. That’s okay too.

The feelings you have for your baby can be compared to a romantic marriage versus an arranged marriage. In the former, you fall in love first, and then make a commitment; in the latter, you make the commitment, and then fall in love. Both kinds can and do work just fine. So don’t worry about or think something is wrong with you if you don’t feel over the moon about your kid right off the bat – as you involve yourself in his life, your love for him will grow deeper and deeper with time.

It’s also very normal to have fluctuating feelings about your child after you take her home and as she grows up. I feel like there’s this idea out there that you’re going to love your child intensely every moment of every day. That’s decidedly not the case. You’ll always have a steady, unchanging love for them, but sometimes you’ll feel incredibly angry at them or just plain ambivalent. You’ll even have moments where you don’t really like them. Your love for your kid is like a little ember that’s always there, and sometimes explodes into a fire of overwhelming joy and pride, and sometimes smolders very faintly. All very normal.

Get hands-on right away. If you’re feeling awkward or unsure about being a dad, the best thing you can do is to ignore that feeling and spend as much time with your baby as you can. If you heed that awkward feeling and avoid the kid, it turns into a cycle where you feel more awkward so you avoid him more, making you feel more awkward, and so on. On the other hand, the more hands-on you get, the more comfortable and bonded you feel with the baby, and the more time you want to spend with him, and that becomes a virtuous cycle that keeps on increasing your confidence.

Be empathetic. A crying baby can be incredibly frustrating. One of the things that helped me stay calm was trying to put myself in my baby’s tiny, tiny shoes. I would think about the fact that these were their very first moments, days, and weeks on this planet, and that every single sight, sound, and smell was brand new. There’s plenty for a baby to be disoriented and upset by.

Even more importantly, I’d think about the fact that when something was bothering them, they had no other way to let us know except crying. Imagine if you felt sick or had an itch or had something scratching you, and you couldn’t do anything about it or ask anybody for help. It’s hard out there for a baby.

Realize they only get more and more interesting. Ernest Hemingway once said, “To be a successful father . . . there’s one absolute rule: when you have a kid, don’t look at it for the first two years.” Now I don’t endorse Papa’s tip, but it is true that while many women love babies, men generally enjoy their children more as they get older. Babies are cute, and sometimes I even have those moments where I wish I could keep them that little forever, but let’s be honest, they’re kind of boring. They’re these cute, but super stoic, unsmiling creatures who only sleep, eat, and poop.

But the good news is that they steadily get more interesting by the week. It’s amazing to watch them gain a personality and begin interacting with you bit by bit. The more they’re able to engage with you, the more enjoyable they become. (On the downside, the more they’re awake, able to control their body, and verbalize, the wider their arsenal of exasperating antics!).

Manage your expectations. Anger and unhappiness of all kinds generally have one root: a mismatch between expectations and reality. The more you think fathering is going to revolve around happily tending to a calm, smiling baby, the more frustrated you’re going to be every time she starts hollering. The more you set your mind that it’s going to be tough, the more you’ll be able to calmly roll with the punches.

This is especially important when it comes to their sleep schedule. Once they start sleeping through the night, it’s tempting to think, “Whew, this is great. Now things can get back to normal!” All goes well for a few months…then they wake up in the middle of the night because they’re teething or sick or who knows why. If you’re not prepared for this to happen, it can really bum you out. Count on it, and you’ll find it easier to deal. Don’t worry, they’ll get back on track again. (But don’t count on it!)

Embrace the challenge. I’d recommend going even further than the recommendation in the point above; rather than just accepting that it’s going to be hard, relish that fact a little. There are so few challenges in our modern life, that I really enjoyed and appreciated getting to weather this one. The more you see fatherhood as a challenge that will test your mettle and give you a chance to rise to the occasion, the easier it will be to maintain a positive mindset, and the more satisfying the experience will be.

When you’re feeling tired and rundown, it helps to keep things in perspective. Being a new dad is hard, but it’s not, say, Iwo Jima hard.

Keep it all in perspective. While it’s good to expect the worst, also realize that in the grand scheme of things, it’s really not that hard. There will definitely be times when you’re going to feel super tired and rundown from the late nights. To keep myself from sinking into despondency at such moments, I would think something like, “Well, at least I’m not tired from trying to stay awake in a freezing cold trench at the Battle of the Bulge….Right, this isn’t so bad. I can do this.”

Embrace this season of your life. When it comes to minimizing your frustration and maximizing your happiness with fatherhood, I cannot emphasize this point enough. It’s completely normal to have moments where you miss your childless life after you have a baby. But the more you concentrate on what you can’t do during this season of your life, instead of what you can do, the more unsatisfied and restless you will feel. Sure, you can’t go out with your friends as much, and doing pretty much anything involves a lot of rigmarole, but, you get to watch a human being’s development moment by moment and play a role in shaping that development. You’re holding a living legacy in your arms. That’s pretty awesome.

Realize that your kid is not going to “make” you happy. Studies showing that having a kid decreases your happiness get passed around a lot these days, but other research has in fact shown the opposite, and the results are actually inconclusive. If you’re worried about how your new baby is going to affect your happiness, keep two things in mind.

First, happiness is overrated. I realize this is a loaded statement that requires a lot more unpacking than I have room for in this post, but suffice it to say that your moment-to-moment feeling of well-being is much less important than your overall sense of satisfaction and fulfillment. If you asked a man who was in the middle of climbing Everest – a man in the midst of an incredibly painful and strenuous challenge — whether he felt “happy,” the answer would probably be no. But despite the pain, the climb is probably bringing him a deep satisfaction, and for the rest of his life, he will experience the fulfillment of being able to say, “I climbed that mountain.” Such is parenting.

Second, happiness is a choice, not something compelled by circumstances or your children. Blogger and dad Matt Walsh recently penned a brilliant piece on this truism. The whole thing is worth reading, but here’s a choice excerpt:

“My kids don’t make my happiness. That isn’t their job. My happiness isn’t a responsibility that falls on their tiny little shoulders. Kids come into this world helpless, naked, and needing, yet so many of us immediately shove them into the Happiness Factory and bark commands. “Get on the assembly line and build me some happiness! Quick! Do your duty, sir!”

The joy and happiness of parenting is like the joy and happiness that can be found in many good things: it comes from sacrifice, self-denial, and self-giving. It comes with work and effort. I have to be the sort of person who finds happiness in giving, and I will not automatically be that sort of person just because I had sex and made a couple of babies. In other words, my kids don’t make me happy to be a parent; I have to make me happy to be a parent. And I am. I am beyond words. But that happiness will decrease if I become more selfish, and it will increase if I become less selfish. If you want your kids to make you happy, you are asking your kids to make you less selfish. That is a demand that is, all at once, incredibly stupid, laughably absurd, and profoundly abusive.

And then maybe we should stop worrying so much about this happiness thing, anyway. I think the happiest people are the ones who spend the least amount of time whining about their desire to be made happy. They do a thing because it’s right, or because they have a duty to do it, or because it is interesting, or beautiful, or enlightening. They choose to find happiness amidst it all, but that was never the point. They aim beyond mere enjoyment, pleasure and satisfaction. If your own happiness is the Alpha and Omega of your life, you’ll never do anything important or become anything significant in this world. Ironically, you’ll also never be happy.”

Keep doing the things you love. While trying to replicate your childless life once you have a kid is a recipe for frustration and poor parenting, you shouldn’t entirely give up the things you once loved either. Have a couple non-negotiables that you make it a priority to keep up, at least after the craziness of the first few months subsides.

Last month when I was rock climbing I saw a couple who brought along their 2-month-old and were happy as can be. I felt like shaking their hands.

Fitness has been something I’ve taken a lot more seriously this year, and I was back going to the gym every day two weeks after Scout arrived. I was super tired from the late nights, but it’s something I love to do and hate to miss. It helped buoy my sense of well-being, and it gave me a small sense of routine when the rest of my life was a little chaotic.

Isolating yourself in your house with just your baby and your wife and a cabinet full of junk food is not going to make you feel good about life or fatherhood.

Keep up your relationship with your significant other. Speaking of holding on to the things you love, don’t let your relationship with your wife/girlfriend go to pot either. Your energies will definitely turn away from each other and towards taking care of the new arrival, and you won’t have much time to just hang out and enjoy each other’s company. And because much of your willpower will be depleted in restraining yourself from throwing the baby out the window, you’ll have less of it for controlling your behavior towards each other. You may find yourselves bickering more and being more prone to losing your patience with each other.

So it’s important to empathize with your partner’s state of willpower depletion, be more tolerant of their foibles, and carve out time — however short — to re-connect each day. If you work and your wife stays home, take the baby off her hands as soon as you come in the door. Watch some TV and do some cuddling and talking when the baby sleeps at night. Remind yourselves to stop and hug. Once her doctor gives her the greenlight after six weeks to resume sex, try to fit that activity in when you can. After a couple months, find a babysitter and have a date night.

The best possible thing you can do for your kid is to have a good relationship with his mother. Don’t neglect it.

This too, shall pass. Adopt this as your motto during this time. When you’re in the trough of the newborn phase, it can feel like the tiredness and stress are going to last forever. But of course they don’t.

A child demolishes the old structure of your life — the one that revolved around you — and it takes some time to build a new structure in its place – one that revolves around your child. Until the new structure is complete, life can feel a little shaky.

For my money, the lowest time is around the 6-7 week mark. You’ve had a month and a half of greatly disrupted sleep, the rush of having a newborn that kept you going the first few weeks has evaporated, and while the baby requires a ton of care, she isn’t giving much in return, unless you count blank stares and dirty diapers. You really have to run on duty and commitment rather than pure affection during that time.

But then things turn a corner. With any luck, your baby will start sleeping through the night around 8 weeks. Once you start consistently getting a full night sleep, you can handle any fussiness that arises during the day. She’ll start smiling around the same time too, and the first little hints of personality will emerge.

Around 6 months, the new structure of your life will have tenuously been erected. You feel much more confident and less awkward. Being a dad starts to feel like your new normal.

A year out, you feel like a pro, and being a dad is your new normal. As more and more months go by, you, your wife, and your son or daughter find a really happy groove and routine.

But by that time of course, it’s time to have another baby, throwing everything back into chaos! But next time around, you’ll be a seasoned veteran.

Best of luck to you on this incredible journey.

 

13 Dec 08:35

When the specs change

by sharhalakis
Miguel A. Arévalo

Agile programming :-D

by Seb

11 Dec 08:12

Cost Capping: 5 years too late or just a pipe dream?

by Matthew Somerfield
Miguel A. Arévalo

Heh, Formula 1 regulation parameters specified in teraflops... yet mid-season they have to replace a set of tyres for being unsafe... they're doing something wrong.

Following a meeting in Paris yesterday, the FIA made several changes to the regulations. They were apparently met with unanimous support from the newly formed Strategy Group and the Formula One Commission. The reason I write this article is the first announcement to come from the meeting:

Cost Cap

The principle of a global cost cap has been adopted.  The limit will be applied from January 2015.  A working group will be established within the coming days comprising the FIA, representatives of the Commercial Rights Holder and Team Representatives.
The objective of the working group will be to have regulations approved by the end of June 2014.

If we recall 2010, a certain Max Mosley, then president of the FIA, tried to make Formula One see sense and adopt just such a cost cap.  Max's proposals were skewed to try to make the big teams see sense, giving more technical freedom to those that could spend under the proposed 40m cap.

No-one bought into Max's proposals, as they believed it would create a 2-tier Formula and the 40m limit was unattainable.  The problem with a cost cap in Formula One is how it affects the sport initially, and that's due to economic scale.  The 2-tier formula that the teams seemingly feared back in 09/10 is actually already apparent in the sport; it is, however, 3 tiers.

Red Bull, Ferrari, Mercedes and McLaren sit in the upper echelons able to attract the necessary funding or sponsorship that is needed to compete at the highest level (somewhere in the region of 100-150m per season).

Lotus, Force India, Sauber, Williams & Toro Rosso are what we commonly refer to as the 'midfield' teams and, with the exception of Toro Rosso, are what I would call independents, i.e. run on the backing of their owners, pay drivers and any sponsorship they can accrue.

Caterham & Marussia are the tail enders or the lap markers starting from nothing when they arrived in the sport in 2010.  Both teams have seen several guises but the core ownership/management remains intact.

The 2 teams at the bottom of the pile arrived in Formula One (along with HRT) bright-eyed and bushy-tailed, fully expecting the rest to have to play on a level playing field with them in the forthcoming seasons.  Their feasibility studies for entering Formula One were firmly targeted at the 40m budget cap, and when this didn't arrive they had 2 choices: re-budget and spend beyond their means, or pack up their toys and move on.  The latter would of course have had financial ramifications anyway, with the FIA/FOM going after their share of damages for not competing, but it seems all accepted their fate and stumbled on.

I think we can all agree that a budget cap isn't necessarily a bad idea; after all, we might still have 3 more manufacturers onboard had the FIA / Formula One acted more swiftly at the start of the financial crisis (BMW, Honda and Toyota).  All of them made their exit from the sport citing financial pressures, most of which came from overspending in order to overhaul the already commanding positions of the established teams.

There is a certain ebb and flow in Formula One every time a large regulation change occurs which sees some teams move between the tiers, able either to establish themselves as a more potent force or languish in the lower bowels.  The last big regulation shake-up saw BrawnGP emerge as the front runner, but this came off the back of the team abandoning their 2008 campaign, using 2 full-scale wind tunnels (Brackley & Japan) and assimilating most of the Super Aguri staff (which is where the Double Deck Diffuser idea was born).  Red Bull, however, were for me the team that made the largest leap; had the DDD been banned at the start of 09, I feel we could now be seeing Vettel/Red Bull holding 5 consecutive Championships and not 4.  Their rise is somewhat disconcerting for the other teams but was born out of the way they invested their money in the run-up to the 2009 rule changes.

We must remember that prior to 2009 the way in which cars were designed and then improved was firmly skewed toward a testing regimen, something that suited the likes of Ferrari and McLaren as they were adept at producing mass quantities of physical parts to test.  Red Bull settled into the new regulations quicker than the others as they had prepared for the new challenge in advance: instead of adjusting their tunnel to a 50% scale like most of their rivals and adopting the same scales for tooling, they invested in the 60% route.  A much more accurate investment, allowing correlation between the test environments of CFD and the wind tunnel and conditions at the track.  This gave Red Bull a head start, and though 10% may not seem like a significant number, in terms of scale it is: stall and the boundary layer turning turbulent are but a few of the elements that are more easily definable at the larger scale.  Scaling can of course be mathematically calculated when given the right information from 100% testing (correlation); the problem comes when you get inconsistent results on a constant basis from the prediction to the 100% model.

This has been prevalent during this last regulation set, when teams have had to design areas of the car with the use of the exhaust plume in mind.  The problem with using the exhaust is that it runs at a significantly different temperature to the atmosphere but is also ballistic.  You can't simply build an engine 60% of the size and use it in the wind tunnel, and so teams have to improvise and fabricate other methods to simulate the exhaust's effects.  Having the ability to do this effectively, whilst also getting the jump on the rest of the field by moving to 60% scaling earlier, gave Red Bull the impetus to overhaul some of the established names.

So as we can see, sometimes it's not about how much you spend but the context in which that money is used; utilising the right equipment at the right time can amount to a huge swing in the positioning of a team in the field.  Red Bull managed to outscore BrawnGP once they had their own DDD, but as we know, the BrawnGP team were suffering financially from the pull-out of Honda, left with the legacy of staff cuts and 50% modelling tools from which to relearn their trade (having used 100% tunnels up until the start of the 09 regulations).  Red Bull meanwhile had designed their RB5 from the ground up in 60%, plying their trade and learning more than anyone about the necessary tools required to extrapolate results.

The ban on in-season testing also led Formula One to proliferate its use of CFD, adding another tool to the pre-build/test phase that can determine whether a solution is worth investing time and money in.

For 2014 the Sporting Regulations have been amended to cover CFD usage for the first time, capping CFD processing at 30 teraflops over an 8-week cycle.  Although this is the first time I've seen it written into the rules publicly, it's long been assumed that the FIA/teams have had an agreement bordering on 40 teraflops in an 8-week cycle.  Whether teams have adhered to this, with it not actually being part of the Sporting/Technical Regulations, is a matter for debate.

The rules regarding the precise use of the wind tunnel have also been included in the appendix of the Sporting Regulations for 2014, with a maximum of 80 runs per week and 60 hours of tunnel occupancy allowed.  (The 60% scaling and 50 m/s regulations remain.)

Of course these changes have been made and stipulated by the FIA in the Sporting Regulations because F1 will return to in-season testing next season.  The re-introduction of track testing brings with it a significant cost and will also have an impact on how the teams approach in-season development.  The additional limitations imposed on the scale tools are an attempt to negate the advantages gained by teams at the track; I do however fear that some teams will make significant gains because of this, especially those that can spend the most on actually testing full-scale development parts.

Just as we saw with the rise of Red Bull in 2009, the regulatory changes that have been coming for some time will certainly benefit some teams more than others.  It therefore begs the question: is a budget cap what Formula One needs? I'd suggest it's rather a constriction of resources, IF we want all the teams to have a similar playing field.  However I'd almost certainly say I prefer a situation where teams are able to express themselves individually and make decisions, and indeed mistakes, about what is the best method for them.  I'd certainly like to be a fly on the wall in the working group that is tasked to arrange the cost cap and have a plan in place for June.  Everyone privy to this group is pursuing an agenda opposed to the others', and finding common ground could be somewhat of a challenge.

Whatever figure the group decides to utilise, I suspect it will still be too high for some of the smaller or lesser-funded teams to generate. Furthermore, remember the saying "There's more than one way to skin a cat"? If the teams want to spend money, they will find a way.  EVERYONE who works within a Formula One team does so because they are the best at what they do; F1 teams have been finding ways around the FIA's rules for decades and this won't stop with a budget cap.
11 Dec 08:09

The IPCC translated – now you understand it.

by plazaeme
Miguel A. Arévalo

Heh, a translation I already did over a decade ago, when the IPCC had already confessed that its models were failing and it wasn't very clear why (but claimed to have better, more catastrophic models).

Preliminary note: I'm translating this at the boss's request (I'm going to kill him). It's written as a joke (and has plenty of snark), but both Nova and Sherrington know very well what they're talking about, and every point they make is perfectly defensible in earnest. In fact, these are points that are often defended very seriously. It's a great summary of how, by playing with words, the IPCC can deceive without lying. If you're up for more, here's another, more technical one that goes beyond mere wordplay: Do the observations the IPCC uses support the claims the IPCC makes?

Translated from joannenova.com.au.

The IPCC translated – now you understand it.

Geoff Sherrington and JoNova.

The first draft of the IPCC Synthesis Report has just been leaked (thanks, Tallbloke). It's part of the big 5th Report (the other leaked parts here). The Synthesis Report is supposed to summarize the science. In the real world the hot topic is "the pause," or hiatus in warming, which the IPCC can no longer keep ignoring. Instead, the masters of mantras reach new frontiers, saying things that are at once technically correct while avoiding pointing out the obvious. With a bit of luck, we'll give them a hand. :)

Translating the IPCC tale:

“The rate of warming of the observed global-mean surface temperature has been smaller over the past 15 years (1998-2012) than over the past 30 to 60 years (Figure SYR.1a; Box SYR.1) and is estimated to be around one-third to one-half of the trend over the period 1951–2012. Nevertheless, the decade of the 2000s has been the warmest in the instrumental record (Figure SYR.1a).”

Translation: Yes, temperatures haven't risen as much as we said they would, even though we've been emitting CO2 faster than ever. Let's forget that this indicates the models were wrong. The important thing is to use the phrases "warmest" and "record" as much as possible.

“The radiative forcing of the climate system has continued to increase during the 2000s, as has its largest contributor, the atmospheric concentration of CO2. Consistent with this radiative forcing, the climate system has very likely continued to accumulate heat since 1998, and sea level has continued to rise. The radiative forcing of the climate system has been increasing to a lesser rate over the period 1998-2011 compared to 1984 to 1998 or 1951-2011, due to a negative forcing trend from volcanic eruptions and the downward phase of the solar cycle over 2000-2009. However, there is low confidence in quantifying the role of forcing trend in causing the surface-warming hiatus, because of uncertainty in the magnitude of the volcanic forcing trend and low confidence in the forcing trend due to tropospheric aerosol. {WG1 8.5; WG1 Box 9.2}”

Translation: Despite the fact that warming is slower than it used to be, CO2 is in theory warming us faster. This is a fatal contradiction, but we hope you won't notice. We'll distract you by saying that the rate of increase in the theoretical forcing has decreased in our estimates of volcanoes and the solar thing, and we're betting it sounds like it adds up and like we know what we're talking about. But we admit we really have no idea why there is no warming. Read between the lines – we know CO2 matters because our models don't work without it – but our models don't work anyway; we don't understand the other factors. The science is settled, except for the inconvenient, unpredictable parts that aren't settled. Give us money.

“For the period 1998–2012, 111 of the 114 climate-model simulations show a surface-warming trend larger than the observations (Box SYR.1, Figure 1a). There is medium confidence that this difference between models and observations is to a substantial degree caused by unpredictable internal climate variability. Variability sometimes enhances and sometimes counteracts the long-term externally forced warming trend (Figure Box SYR.1). Internal variability thus diminishes the relevance of short trends for long-term climate change. There are also possible contributions from inadequacies in the solar, volcanic, and aerosol forcings used by the models and, in some models, from too strong a response to increasing greenhouse gases and other anthropogenic factors. {WG1 2.4, 9.3, 9.4; 10.3, 11.2, 11.3, WG1 Box 9.2}”

Translation: 95% certainty looks like this: 97% of our models are wrong. (Look here too.) We blame it on the unpredictable parts of what goes on in the climate. Maybe we're also wrong about the sun, the volcanoes and the dust.

“In summary, the observed recent surface-warming hiatus is attributable in roughly equal measure to a cooling contribution from internal variability and a reduced trend in external forcing (expert judgment, medium confidence). {WG1 8.5, Box 9.2}”

Translation: This sentence conveys great confidence because we attribute "the pause" to something. Don't look too closely: it's a cooling we didn't foresee in advance, still can't predict, and couldn't measure even if we could predict it. "Internal variability" is the new wildcard covering everything we don't know. It's the all-purpose makeup for any occasion. We hope nobody asks whether "internal variability" could have caused the warming before it caused the cooling.

Added: We love expressions like "expert judgment." It makes us feel important.

“Footnote: The connection of the heat budget to equilibrium climate sensitivity, which is the long-term surface warming under an assumed doubling of the atmospheric CO2 concentration, arises because a warmer surface causes enhanced radiation to space, which counteracts the increase in Earth’s heat content. How much the radiation to space increases for a given increase in surface temperature, depends on the same feedback processes that determine equilibrium climate sensitivity.”

Translation: Feedbacks are the undoing of the whole tale, so we use the term only once, in a footnote on page 21 of a 99-page document (which right now carries "Do not cite, copy or distribute" on every page). We don't expect any journalist to understand what this paragraph means, or to ask about it, but if skeptics claim we deny that feedbacks determine the final outcome, we can point to this hidden bit and say we are completely transparent.

A look at the classic IPCC mantras:

“….. the decade of the 2000s has been the warmest in the instrumental record …”

Translation: We can't talk about warming over the last 17 years, so this is a good scare phrase to use instead. We won't say that the warming began 300 years ago, long before our CO2 emissions started. We won't say that our temperature records are pathetically short, or that the world was warmer for most of the thousands of years since the agricultural [Neolithic -pm] revolution. Who needs to know? Nor will we mention that many of the temperature records depend entirely on our adjustments. All those old thermometers read too high.  We had to adjust them. Odd that it took 70 years to "correct" those measurements. (Taking temperatures wasn't a very rigorous business back in the days of the atomic bomb and the moon landing. You can't expect scientists to do something as complex as measuring air temperature accurately without a computer model.)

“….. its largest contributor, the atmospheric concentration of CO2 ….”

Translation: CO2 is the main cause of warming because our models that don't work say so. Too bad there's no accepted, replicated study providing a quantitative link between CO2 and atmospheric temperature. This is where we would cite such a study, if we could find one. Instead, we give a wide range of possibilities; in fact CO2 might have no effect at all, something that hasn't been demonstrated. But there's no need for us to spell all that out.

“….. the climate system has very likely continued to accumulate heat since 1998 …”

Translation: In fact we're not absolutely sure the climate system has kept warming. This is our way of saying there may have been no warming since 1998.

“….. sea level has continued to rise …”

Translation: Many things cause sea level rise, many of them unrelated to global warming or CO2, but we won't mention that. Nor will we say that much of that rise may come from adjustments to the satellite data. As long as we can filter very noisy data, we can say a rise continues. A ten-year filter can become a twenty-year filter, and that one a 20,000-year filter if need be.

“….. causes enhanced radiation to space, which counteracts the increase in Earth’s heat content …”

Translation: See Stefan–Boltzmann, 1879.

…………………………………………………………………………..

The models meet reality.

The reality is that the surface temperature usually shown on global temperature maps has not changed significantly from 1997 to 2013. This pause was not predicted by the existing climate models. Fail. The adjusted climate models still run hotter than reality. Fail.

Thanks to the skeptics, the IPCC has finally been forced to admit (in convoluted language) that its models don't work because they don't understand the climate.

Dozens of models, millions of data points, more CO2 emitted than ever, and the models overheat and burn.  | Graph: John Christy. Data: KNMI.

See this post for more on the failure of the models.

The tale and the PR will keep fooling any credulous journalist who wants to believe, and the IPCC provides plenty of convenient quotable phrases. Let's try to help those journalists …

11 Dec 08:04

Google Play Books now allows direct uploading from Android devices

by Juan Carlos Torres
Miguel A. Arévalo

It's important that mobile devices also be seen as a source of this content... is someone finally going to make an EPUB editor for Android!?

Google has just pushed out an update to its Google Play Books app that brings in a couple of improvements and new features. In particular, it has now added support for uploading books to Google Play directly from the device itself.

google-play-books

The performance of Google Play Books has been improved in this release. The changelogs claim that books will now open faster and smoother. It also tries to improve the reading experience by setting a lower floor limit for brightness settings and offering support for landscape mode, which would be a good fit for larger tablets. Performance and stability have also been improved across the board.

One particular new feature that might interest Play Books users is the new upload support. In the past, you could only upload your own books stored on your computer via the Google Play Books web interface. Now users will be able to do so straight from their device. Simply tap on the file from an email or from downloads and you will be offered an option to upload to Play Books, which will make the book available across all your Android devices. Supported file types are, however, still limited to EPUB and PDF.

The new version of the Google Play Books app is now up on the Google Play Store. As always, it will take some time for the update to arrive in all regions, so don’t panic if you don’t have the new upload feature yet.

Download: Google Play Books on Google Play Store

09 Dec 13:26

Give Your Kids Controllers That Aren't Plugged in for More Gaming Time

by Melanie Pinola
Miguel A. Arévalo

I don't know whether this is brilliant or cruel...

Give Your Kids Controllers That Aren't Plugged in for More Gaming Time

Once you have kids, you no longer have the luxury of spending hours playing video games. That is, unless you get them to play with you...but not really.

Read more...

05 Dec 10:07

Kids These Days...

Miguel A. Arévalo

Someone should have explained to that kid that URL shorteners exist...

Kids These Days...

Link for the lazy.

Submitted by: Unknown (via kcapulet)

05 Dec 09:27

Detroit files for bankruptcy and opens a three-trillion-euro can of worms

by PABLO PARDO
Miguel A. Arévalo

In cases like this, beyond a suspension of payments, an orderly creditors' proceeding is what's called for... toward anarchy, from the law... to the law.

The US courts have accepted the city of Detroit's bankruptcy filing. The decision gives the 'green light' to restructuring the debt of the 'Motor City', in what constitutes the largest default by a public entity in the US since nine states suspended payments in 1839-1843. Detroit has liabilities of between $18 and $20 billion (€13.2 to €14.8 billion).

Judge Steven Rhodes approved Detroit's bankruptcy petition because the municipality's 700,000 residents "are suffering" from the city's crisis; US bankruptcy law is much more lenient than Europe's, and in practice the filing amounts to a debt restructuring.

The ruling, however, opens two cans of worms. The first is that Rhodes has declared that, as part of the debt restructuring, Detroit may cut its public employees' pensions, something they flatly oppose.

The second is that it opens the door to more giant municipal bankruptcies in the US. The municipal bond market (the so-called 'munis') holds $4 trillion (€3 trillion) in private investment, thanks to the favorable tax treatment these assets receive. The prospect of more Detroit-style bankruptcies has reduced investors' appetite for these bonds.

05 Dec 08:51

Stealing $100 Million in Bitcoins

by Brian Benchoff
Miguel A. Arévalo

Bitcoin really is good for everything... stealing isn't so easy anymore :-D

bitcoin

In early October of this year, online Bitcoin marketplace and ‘the eBay of drugs’ The Silk Road was taken down by the FBI. Just after the black vans took Silk Road head honcho [Dread Pirate Roberts] away, a new Bitcoin marketplace came onto the scene called Sheep Marketplace. Sheep Marketplace closed after revealing that 5,400 bitcoins – or $5.8 million USD – were stolen by the user EBOOK101 by exploiting a bug in the Sheep site.

Over this last weekend, it was revealed this bug in the Sheep Marketplace site wasn’t responsible for the loss of 5,400 coins, but instead 96,000 BTC, or $100 million USD, making this one of the largest thefts of all time.

Whoever was responsible for this theft didn’t make a clean getaway. Because the Bitcoin block chain records the history of every transaction, laundering bitcoins is harder than it seems. The most common method is to ‘tumble’ the bitcoins – sending them through multiple wallets, combining and recombining them, until tracking groups of bitcoins just becomes too hard.
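Tracking is possible at all because every transaction's inputs reference earlier transactions, so the public ledger can be walked mechanically. A toy sketch of that idea (entirely made-up transaction IDs and addresses; the real Bitcoin data model with scripts and UTXOs is far richer):

```python
from collections import deque

# Toy ledger, NOT the real Bitcoin data model: txid -> (input txids, payee).
# All ids and addresses here are invented for illustration.
ledger = {
    "tx1": ([], "thief_wallet"),
    "tx2": (["tx1"], "mixer_a"),
    "tx3": (["tx2"], "mixer_b"),
    "tx4": (["tx3"], "final_wallet"),
}

def trace_forward(start_tx):
    """Breadth-first walk over every transaction that (directly or
    indirectly) spends outputs of start_tx."""
    seen, queue = set(), deque([start_tx])
    while queue:
        tx = queue.popleft()
        if tx in seen:
            continue
        seen.add(tx)
        # Any transaction listing `tx` among its inputs spends from it.
        for txid, (inputs, _payee) in ledger.items():
            if tx in inputs:
                queue.append(txid)
    return seen

print(sorted(trace_forward("tx1")))  # ['tx1', 'tx2', 'tx3', 'tx4']
```

Tumbling tries to defeat exactly this traversal by fanning coins out across many wallets and merging them with other people's coins, so the reachable set becomes too large and tangled to be meaningful.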

[sheeproadreloaded2] over on Reddit managed to track these bitcoins to this bitcoin address, an amazing feat that also means there are 96,000 coins in a wallet somewhere that can’t be spent or cashed out without the thief telling the world who he is.

As far as crimes of the century go, this one is at least in the top ten. Unless the thief behind this heist is extraordinarily smart, though, his identity will most likely be found out eventually.


Filed under: news
18 Nov 11:07

Paul Ramsey: BC Electoral Redistribution and Population Dispersion

Miguel A. Arévalo

Cool, I thought this came from my politics feeds, but it's actually from my PostgreSQL ones...

I've been asked by my technical readers to stop writing so much about politics, but I cannot help myself, and this week I have the perfect opportunity to apply my technical skills to a local political topic.

History and Background

Like Britain and the USA (and very few other jurisdictions anymore), British Columbia has a first-past-the-post representative electoral system. The province is divided up into "electoral districts" (also known as "ridings" in the British tradition) and each district returns a single member to the Legislature. In each district, the candidate with the most votes wins the district, even if they only obtain a plurality. In the Legislature, the party with the most seats forms government.

In such a system, the geographical layout of the districts has a great deal of importance, because it is possible for a party to win a majority of votes in the province, but a minority of seats in the legislature, if the votes are concentrated in particular seats. This actually happened in British Columbia in 1996. It also happened in the USA in the 2000 Presidential election, since their presidential Electoral College effectively acts like a weighted version of a first-past-the-post Legislature.

In a representative democracy, it's important that everyone's vote have the same weight, which means ensuring that each district has approximately the same number of people in it. As relative populations grow in some regions and shrink in others, districts can become unbalanced and need to be redrawn. In the USA, this "redistricting" process is often driven by partisan considerations, and can lead to districts like this (try out this awesome gerrymandered districts puzzle game):

Fortunately, since 1998 British Columbia's districts have been drawn every 10 years by non-partisan "Electoral Boundaries Commissions", and the primary consideration has been creating districts that are as equal in population as possible while allowing for effective representation.

Effective Representation

BC is a big place and a place of extremes. The smallest district in BC is in downtown Vancouver (Vancouver-West End), with an area of under 500 hectares: it takes less than 30 minutes to walk across it and 48,500 people live there.

The largest district is in the north-west of the province (Stikine), with an area of almost 20,000,000 hectares: about the size of Ireland, Switzerland, Denmark and the Netherlands, combined. Just over 20,000 people live in it. When you're dealing with areas this sparsely populated "effectiveness of representation" begins to have some concrete meaning.

On the other hand, the principle of "one person, one vote" is the corner-stone of democracy, and the goal of an electoral boundary re-distribution is to try to achieve it, as far as possible. There is a tension inherent in the process.

2008 Commission Catastrophe

Population growth in BC over the last generation has been concentrated in the south: mostly in Vancouver and its suburbs, with some on Vancouver Island and in Kelowna. Commissions have dealt with this growth by a combination of increasing the number of seats in the Legislature and slowly increasing the size of the rural districts.

In 2008, this process reached a tipping point, as the Commission recommended two new seats, and the transfer of three seats from rural areas to urban areas. Rural BC exploded in anger, and the government of the day rushed in legislation directing the Commission to add more seats than recommended and to avoid removing seats from certain rural areas.

At this point, though the process remained non-partisan (both parties in the Legislature supported the new plan), it had become thoroughly politicized (the carefully considered deliberations of the Commission had been hastily overturned by politicians for public relations purposes).

Formalization of Politicization

No doubt remembering the tumult of the 2008 experience, the current government of BC has released a proposal for the rules governing the next Electoral Boundary commission. The proposal aims to avoid a messy politicization of the process at the end, by quietly politicizing it in advance:

  • The Commission may not recommend adding any further seats to the Legislature
  • The Commission may not remove seats from three protected regions: North, Columbia-Kootenay, and Cariboo-Thompson

The protected regions look like this.

Note that the Okanagan region is isolated from the rest of the "unprotected" areas of the province, making it impossible to juggle population into or out of the region. That means the Okanagan can either gain or lose a whole seat, but never lose "half" a seat by having population juggled in or out via boundary changes.

Anyone with a passing familiarity with BC electoral geography will recognize that this proposal entrenches an already large and growing deviation from the principle of one-person-one-vote, but I want to calculate just how large, and also to measure the "fairness" of this particular proposal.

Population

The electoral district boundaries of BC are readily available as GIS files online, but do not have population information attached (and would be out of date if they did, since they pre-date the most recent census).

Similarly, the StatsCan boundary files can be downloaded, and the attribute file giving the census 2011 population in each block is also available. There are about 500-800 blocks in each electoral district, making for a very fine-grained profile of where people are concentrated in each district.

I loaded the GIS files into a PostGIS spatial database for analysis. Once the electoral districts (ed) and dissemination blocks (db) were loaded, calculating the electoral district population in PostGIS was a simple spatial join query:


SELECT ed.edabbr, ed.edname, sum(db.popn) AS popn
FROM ed, db
WHERE ST_Intersects(ed.geom, db.geom)            -- index-assisted overlap test
AND ST_Contains(ed.geom, ST_Centroid(db.geom))   -- count a block only in the district holding its centroid
GROUP BY ed.edabbr, ed.edname;

The results of this calculation and others in this article can be seen in the redistribution_bc.xlsx spreadsheet I've placed online.

A quick summary of the population results shows that, among other things:

  • The current distribution is extremely lopsided, with the most heavily populated riding (Surrey-Cloverdale, 73042) having well over 3 times the population of the least populated (Stikine, 20238)
  • The current provincial average population is 55804
  • The average population in the "protected" regions is 35609, 31% less than the provincial average
  • The average population in the "unprotected" regions is 55804, 8% higher than the provincial average
  • A vote in the protected regions will be over 1.5 times more "powerful" than one in the unprotected regions
  • Of the 85 ridings, 26 are below average and 59 are above average, indicating that the problem of underpopulation is concentrated in a minority of ridings

There is no doubt that the government proposal will enshrine the regional imbalance in representation, and further worsen it as continued migration into the south pushes the balance even further out of line.

"Fair" Imbalances

Legal challenges to imbalanced representation have resulted in court decisions that indicate that it is constitutional within limits, and with reasonable justification. The limits generally accepted by the courts are +/- 25% of the provincial average, and the starting point of this proposal already exceeds that on average--some individual ridings (like Stikine) will be much worse. Political commentators in BC are already musing about whether ridings built under this scheme would survive a court challenge.

Of more analytical interest is whether the scheme of selecting "protected regions" is a good one for choosing which ridings should receive preferential treatment.

"Representing" a riding involves being available to your constituents, meeting with other orders of government in your riding (cities, school boards), and attending local events. Representation is very much tied up with being where the people are.

  • If the people are all in one place, near together, then representing them is easy.
  • If the people are spread out, in many different localities, then representing them is hard.

Can we measure the "spreadoutness" of people? Yes, we can!

Each riding contains several hundred census dissemination blocks, each of which has a population associated with it. Imagine measuring the distance between each block, and all the other blocks in the riding, and weighting that distance by the population at each end.

For Vancouver-Fairview, the picture looks like this.

The blocks are fairly regular, the population is all very close together, and the dispersion is not very high.

For Skeena, the picture looks like this.

The population is concentrated in two centers (Terrace and Kitimat) reasonably far apart, giving a much higher dispersion than the urban ridings.

In mathematical terms, the formula for "dispersion" looks like this.
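(The formula itself was an image in the original post and is missing from this copy; the following is a reconstruction from the SQL in this post, so treat the notation as my own reading.) For a riding whose census blocks have populations p and centroids a distance d apart, the dispersion score is the population-weighted mean separation between blocks:

```latex
D = \frac{\sum_{i}\sum_{j} p_i \, p_j \, d(i,j)}{\sum_{i}\sum_{j} p_i \, p_j}
```

Here p_i is the population of block i and d(i,j) is the great-circle distance between the centroids of blocks i and j; the query below additionally divides by 1000 purely to keep the numbers readable.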

In the database, after creating a table of census blocks that are coded by riding, the calculation looks like this.


WITH interim AS (
  SELECT
    a.ed,
    max(a.edname) AS edname,
    -- population-weighted sum of block-to-block distances
    sum(b.popn * a.popn * st_distance_sphere(a.pt, b.pt)) AS score,
    -- total population weight, for normalization
    sum(b.popn * a.popn) AS p2
  FROM popn a, popn b
  WHERE a.ed = b.ed      -- pair every block with every block in the same riding
  GROUP BY a.ed
)
SELECT ed, edname, score/p2/1000 AS score
FROM interim;

Taking the ratio of the distance scaled populations against the unscaled populations allows populations that are far apart to dominate ones that are close together. Scaling the final result down by 1000 just makes the numbers more readable.

As before, the results of this calculation and others in this article can be seen in the redistribution_bc.xlsx spreadsheet.

Is Regional Protection "Fair"?

Using the measure of dispersion allows us to evaluate the government proposal on its merits: does protecting the North, Kootenays, Cariboo and Thompson protect those ridings that are most difficult to represent?

In short, no.

The regional scheme protects some difficult ridings (Stikine) but leaves others (North Island) unprotected. It also protects ridings that are not particularly dispersed at all (Kamloops-South Thompson), while leaving more dispersed ridings (Powell River-Sunshine Coast, Boundary-Similkameen) unprotected.

Among the larger ridings, Skeena is notable because even though it is the 10th largest riding by area, and 10th sparsest by population density, it's only the 17th most dispersed. There are many smaller ridings with more dispersion (Powell River-Sunshine Coast, Nelson-Creston, Boundary-Similkameen). This is because most of the people in Skeena live in Terrace and Kitimat, making it much easier to represent than, say, North Island. Despite that, Skeena's population is 43% below the average, while North Island's is 5% above.

Kamloops-South Thompson is the least dispersed (score 15.2) protected riding, and it's worth comparing it to the similar, yet unprotected, Nanaimo-North Cowichan (score 16.2).

Kamloops-South Thompson (protected) consists of a hunk of Kamloops, and a string of smaller communities laid out to the east for 50KM along Highway 1.

Nanaimo-North Cowichan (unprotected) consists of a hunk of Nanaimo, and a string of smaller communities laid out to the south for 45KM along Highway 1 (and some settled islands).

What is it about Kamloops-South Thompson that recommends it for protected status along with truly dispersed difficult ridings like Stikine? Nothing that I can determine.

Let the Commission Work

The intent of the government's proposal to amend the Redistribution Act is clearly to avoid the firestorm of protest that accompanied the 2008 Commission report, and it's good they are thinking ahead.

They need to think even further ahead: the consequences of having the boundaries enacted, then reversed in court, will be far more disruptive than allowing the Commission to proceed with the necessary work of redistributing BC's districts to more fairly reflect our actual population distribution.

The end result of an unconstrained Commission will be fair boundaries that still reflect the representation needs of dispersed ridings by giving them lower populations within the limits already acknowledged by the courts: +/- 25% with a handful of exceptions (I'm looking at you, Stikine).

I encourage you to explore the data on dispersion, and how it relates to the regional "protection" scheme, in the redistribution_bc.xlsx spreadsheet.


14 Nov 08:48

When facts aren’t facts

by Russ Roberts
Miguel A. Arévalo

Essential reading for those who use this kind of survey to talk about inequality and its growth... Simpson's paradox.

(Russ Roberts)

One standard explanation for the increase in measured inequality over the last few decades is that innovation has enhanced the productivity of highly educated people more than the productivity of less educated people. That in turn has increased the wages of college graduates relative to those who have only graduated high school. In the New York Times, Eduardo Porter refers to Larry Mishel of the Economic Policy Institute who is skeptical of this phenomenon. Porter writes:

This rendition of history suggests that improvements in technology — coupled with a college graduation rate that slowed sharply in the 1980s — have been principal drivers of the nation’s widening income gap, leaving workers with less education behind.

But critics like Mr. Mishel point out that this theory has important blind spots.

For instance, why have wages for college graduates stagnated over the last decade, even as innovation continues at a breathtaking pace? Between 2000 and 2008 the typical earnings of men with at least a bachelor’s degree fell by more than $2,000, after inflation, to $70,332 a year. Between 2008 and last year they fell a further $3,500. Though somewhat less pronounced, the pattern is similar for women.

This seems to refute the claim that technology gives college graduates an advantage. Surely there was technological change between 2000 and 2008, yet earnings of college grads fell. This calls into question the relationship between technology and inequality, but it’s alarming in its own right. It suggests that the rules of the game are rigged in some way. People are doing the right thing–they are going to college–and yet their salaries are falling. Here’s the key claim:

Between 2000 and 2008 the typical earnings of men with at least a bachelor’s degree fell by more than $2,000, after inflation, to $70,332 a year.

What does that mean? It seems pretty straightforward. But it’s easy to misinterpret. Here is what it does not mean. It does not mean that people who had a college degree in 2000 found themselves with lower income eight years later. Let me say that again. It does not mean that people who had a college degree in 2000 found themselves with lower income eight years later. It COULD mean that. But it doesn’t have to.

Why not? Because the people who were sampled in 2000 and 2008 are not the same people. What the statement does mean, is that if you take the median income of people with at least a bachelor’s degree in 2008, that number is lower than the median income of all the people with at least a bachelor’s degree in 2000. (This assumes that prices are measured correctly. All of the income comparison in this post correct for inflation using standard methods. There are problems with those methods, but let’s put that issue aside.)

You might think that this is a nit-pick. How much can the population of highly educated people change in just eight years? Sure, some people retire or die. Others enter the labor force. But those effects must be small.

They’re not. They’re potentially huge and they make the comparison meaningless for trying to figure out if it’s a good idea to go to college or if the economy is rewarding college graduates. Stick with me and you will learn one of the coolest things there is to learn about statistics and data and the elusiveness of truth. Sometimes something that looks like a fact isn’t close to a fact. And you’ll discover something pretty amazing that happened between 2000 and 2012 that you might not know about.

If you want the details, go below the fold…

Comparisons between 2000 and 2012 might not show what you think they show because there could be changes in not just who is working but in how much they work and when they work. So it’s possible that the proportion of the work force that works full-time is different, or all year vs. part-year. And even when you hold those things constant, men make more than women. So another variable that can change over twelve years is the proportion of the work force that is male vs. female. So to hold all those factors approximately constant, I’m going to look at just women who work full-time, year-round and who are over 25. I pick the age point just because that’s how the Census lists it.

I’m going to use a slightly longer time horizon, 2000 to 2012, so I can use the latest data. It also makes the point I want to make a little more dramatic.  But if I did it for 2000-2008, I’d get the same effect. The bottom line is that you can’t draw simple conclusions from what look like simple transparent changes.

Between 2000 and 2012, here are the changes in real median income for women 25 and older who were defined as working full-time, year-round by education level. The data I’m using are census data from here.

Less than 9th grade: -3.7%

9th-12th but didn’t finish: -6.7%

High school graduate: -3.3%

Some college but no degree: -3.7%

Associate’s degree: -10.0%

Bachelor’s degree or more: -2.7%

Looks like a pretty bleak 12 years, doesn’t it? And sure enough, these numbers echo the Mishel claim listed above–college grads seem to be doing worse.

Given those numbers, what do you think happened to the median income for all women, regardless of education level? If you don’t think about it very much, you’d take an average of the six decreases in the data for each income group. Remember, those six categories are exhaustive–they cover all women 25 and over who work full-time year-round in the workforce. The average change across all groups turns out to be almost exactly 5%. So if you didn’t think about it much, you’d guess that for all women 25 and over who worked full time, income fell about 5%. You’d think that might be a pretty good approximation. But you’d be wrong.

Or maybe you’d realize that all the groups aren’t the same size, so maybe you shouldn’t just take the average. Maybe you should take a weighted average. But you’d be pretty confident that the overall change in the median income of women would be somewhere between -2.7 and -10, the range between the smallest change and the biggest change.

But you’d be wrong. The income of women over the age of 25 who worked full time actually increased between 2000 and 2012. It went up 2.8%. (Again, all these numbers are corrected for inflation.) Not a great twelve years. But very different from a 5% drop.

Now that seems impossible. I must have made a mistake. If every sub-group went down, how could the total for the whole group go up?

But it is possible. How? The answer is that the women in the full-time workforce 25 and over in 2000 were not the same women in 2012. In particular, they were a lot better educated. But how much better educated could they be in twelve years? Or a better way to put it, how much could the educational composition of the full-time female labor force change in a mere 12 years?

A lot. More than I would possibly have come close to guessing. In 2000 30.5% of all female workers who were 25 and older and who worked full-time and year-round had at least a bachelor’s degree. In only 12 years, that number had increased to 41.8%. That’s an enormous change, a 37% increase in the proportion. And it’s the category that has the highest overall pay of all six categories. So if you increase the proportion of the population in the highest paying group, the overall average can go up even when each group, including the group that pays the most, goes down.
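A tiny weighted-average sketch makes the mechanism concrete. The numbers below are invented round figures, not the actual Census data: each group's mean pay falls, yet the overall mean rises, because the high-paying group's share of the workforce grows.

```python
# Toy illustration of Simpson's paradox with invented numbers (not the
# actual Census figures): every group's average falls, yet the overall
# average rises because the high-paying group grows as a share of workers.

def overall_mean(groups):
    """groups: list of (share_of_workforce, mean_income) pairs."""
    return sum(share * income for share, income in groups)

# Year 1: 30% of workers are college grads earning 70k; 70% earn 35k.
year1 = [(0.30, 70_000), (0.70, 35_000)]
# Year 2: every group's pay drops ~3%, but grads are now 42% of workers.
year2 = [(0.42, 67_900), (0.58, 33_950)]

print(round(overall_mean(year1)))  # 45500
print(round(overall_mean(year2)))  # 48209 -- higher, though both groups fell
```

The composition shift (30% to 42% in the top group) more than offsets the within-group declines, which is exactly what the education data above do to the female median.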

Here’s another way to see it. In 2000, 11.6 million women 25 and over who were working full-time had at least a bachelor’s degree. Twelve years later there were 17.2 million women 25 and older working full-time with at least a bachelor’s degree.

That’s incredible. Between 2000 and 2012, the number of women with at least a bachelor’s degree increased by 48% in absolute numbers. That’s a silent revolution in the work force. (This phenomenon is called Simpson’s Paradox by the way. Floyd Norris of the New York Times has a similar example here.)

And now you can understand the complexity of claiming that college grads are doing worse than they did before. They’re not the same people. When a lot more women are going to college and entering the labor force with a college degree, you wouldn’t expect them to necessarily have the same income as the people who have already been to college. For one reason, they’re likely to be younger with less experience. But as the number of people going to college increases, you might expect their starting salaries to be lower than workers from before. The pool is less selective. And of course, there are a hundred other reasons their salaries could be lower–maybe they major in different things than people did in the past.

My point is that you can’t prove much by observing that salaries for people with a college education are going down over some period. It doesn’t mean that people who already have college degrees are doing worse. They could be doing better, but the starting salaries of the newcomers can pull down the median and the average as well.

The bottom line is that the “stagnation of college grad salaries” is misleading as a measure of the health of the economy and even of the health of people who have already graduated and are working. It can actually be an encouraging measure—one of the reasons college grad salaries can stagnate is that more people are going to college and the new grads have lower salaries than the existing median. But the new grads are better off than they would be if they had not gone, and their salaries do not necessarily mean that those who would have gone no matter what are doing worse.

14 Nov 08:01

YotaPhone, the dual screen Android phone, set for release in December

by Nate Swanner
Miguel A. Arévalo

Cool, let's see how well it catches on...

The YotaPhone, an odd experiment at dual-screen compatibility, is finally going to see a launch. Set for December of this year, Russian OEM Yota is set to bring the two-faced device to the world. The launch will be “international”, and the YotaPhone will come out before Christmas.

YotaPhone


The device pairs a run-of-the-mill color screen on one side, and a Kindle-esque eInk display on the other. We first saw the device at CES this year, then again at MWC in Berlin. Now that Yota has lined up their supply chain, it seems they’re ready to bring the YotaPhone to the masses. While we’re excited to see if the concept has real-world merit, we’re not excited at the potential price. Early estimates put it at about 500 Euro, or $750.

If you’re looking for a spec-heavy monster, the YotaPhone won’t help you. It’ll sport a dual-core 1.7GHz chipset on the back of 2GB RAM. The main screen (color) is a 4.3-inch 720p display, and the battery is a disappointing 1,800mAh. With such a diverse set of hardware, we wonder why — or how — it’ll keep up with itself. Clearly meant for lesser tasks, the price tag doesn’t seem to make the stretch worth it.

We’ll reserve judgement until we get a look at it, but even with all that Android can do for lesser devices, we’ll have to see just what this is capable of. The dual screen throws us off a bit, but mostly because we wonder why it’s there, and what we would want to do with it. More to the point, what apps or services can use it? Even with today’s confirmation, we’re still confused by the YotaPhone.

13 Nov 09:28

Android eats up 81% of smartphone market, average prices falling, says IDC

by Juan Carlos Torres
Miguel A. Arévalo

mmmm, 80%

IDC has released its figures for smartphone shipments for the third quarter of this year and Android is yet again at the top of it all. The market intelligence company attributes this lead, as well as the rather surprising growth of Windows Phone devices, to steadily declining average selling price in the market.

nexus-smartphone-2

Compared to the same quarter last year, the smartphone market grew by 39.9 percent, shipping out 261.1 million units in the third quarter of 2013. Unsurprisingly, the majority of these, about 81.0 percent or 211.6 million units, run Android. Android shipments grew by as much as 51.3 percent year over year, and its market share is up from 74.9 percent in the same period last year. This covers all Android devices at all prices and sizes.

idc-q3-2013-smartphone-shipments

What is perhaps remarkable is the performance of devices carrying the Windows Phone OS. While still far below 5 percent, Windows Phone devices have managed to climb up to third spot with 3.6 percent, a remarkable 156 percent increase from last year, displacing BlackBerry. iOS also saw a slight decline from 14.4 percent last year to 12.9 percent. IDC, however, predicts that the fourth quarter might bring in a growth surge if the market reception of the newly launched iPhone 5s and iPhone 5c is any indicator.

idc-q3-2013-average-selling-price

According to IDC, this growth in Android’s and Windows Phone’s shares can be attributed to both platforms’ capability to address the low-price smartphone market, a segment that Apple’s iOS is probably not interested in. Average selling prices or ASPs of smartphones declined by 12.5 percent this year to $317. Another factor to consider, says IDC, is the rising popularity of phablets, smartphones sporting screens of 5 to 7 inches, with almost all major Android vendors and even Nokia jumping on the bandwagon. Apple’s refusal to play in that arena might have cost iOS some share in the third quarter as well.

SOURCE: IDC

12 Nov 13:15

Clown show in Congress to demand the Tobin Tax

Miguel A. Arévalo

These are the things that worry me about UPyD...

PSOE, IP, PNV, CiU and UPyD at the doors of Congress | @tasarobinhood.

Libre Mercado 2013-11-12

The Robin Hood Tax Alliance is back, and this time with the backing of the entire opposition. Robin Hood caps in hand, PSOE, CiU, Izquierda Plural, PNV and UPyD turned out to show their support for this platform, whose goal is to promote the tax on international financial transactions known as the Tobin Tax.

All the opposition's representatives gathered at the doors of the Congress of Deputies, where they did not hesitate to pose for a family photo with their green accessories and post it on Twitter.

What they demand is that Spain pass the bill needed to transpose the corresponding European directive so that the tax can be applied in 2014. For the PSOE's spokeswoman in Congress, Soraya Rodríguez, the Tobin Tax is a tax "on the large profits obtained from financial speculation." Rodríguez did not hesitate to warn, with a certain dramatic flair, that this is "the hour of truth" for Spain and for the whole European Union, which "must decide whether to side with the citizens or with capital."

Aitor Esteban, of the PNV, flatly denied that taxing money transfers between countries would harm international trade, "because the tax rate is so low that it will not prevent any transaction." But then, if the amount is so insignificant, what is the point of the costumes?

It is worth remembering that, according to the latest study by the Brussels-based think tank New Direction, the Tobin Tax would destroy more than 80,000 jobs in Spain and up to 640,000 jobs across the EU through lost economic activity.

12 Nov 13:06

Keep Your SD Cards Data Safe with the SD Locker

by Adam Fabio
Miguel A. Arévalo

Curious, I didn't know this existed...

sdlocker_1

[Karl Lunt] has come up with a simple circuit for protecting data you have stored on SD cards. As is relatively well-known, the little lock switch on the side of most SD cards really doesn’t do anything more than the switch on floppies or the tabs on VHS or cassette decks. It’s up to the reader/writer to check the status of the tab and decide if it should write to the card or not. Not a very safe system. However, it’s not the only write protection system built into SD and SDHC cards. As part of the standard, cards have three protection methods: A TMP_WRITE_PROTECT bit, a PERM_WRITE_PROTECT bit, and a PWD register.

The PERM_WRITE_PROTECT bit permanently write protects the card. The bit can not be reset, so you should be really sure you want to keep the data on the card forever. The PWD register is a password register. The card will not allow any access (read or write) unless a password is provided. The TMP_WRITE_PROTECT bit is a temporary write protect. This is the bit that [Karl] is working with. When TMP_WRITE_PROTECT is set, the card can be read but not written. Note that there is no true protection here, as anyone can modify the bit. However, this should stop grandma from accidentally deleting your wedding pictures.

[Karl's] device is very simple. A card is inserted into an Altoids tin enclosure. One button locks the card, another unlocks it. Three LEDs return status – power, card locked, and card unlocked. Under the hood, he’s using an Atmel ATmega328 to set and clear the TMP_WRITE_PROTECT bits. Power is provided by two AA batteries, and regulated with a Pololu 3.3v boost regulator. [Karl] has also included a serial port for control and debug information. We think this is a great hack, however one thing we’re not sure of is how or if these features are implemented in all cards. We’re relatively sure the name brand cards stick to the SD/SDHC spec sheet, but what about all the knockoff and no name brands from overseas?
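As a sketch of what's involved at the data level: per the simplified SD specification, TMP_WRITE_PROTECT is one bit of the 128-bit CSD register, and flipping it means refreshing the register's CRC7 checksum too (actually pushing the new CSD to a card takes a PROGRAM_CSD transfer over the bus, which this sketch does not perform).

```python
# Where TMP_WRITE_PROTECT lives in an SD card's CSD register, and how the
# CRC7 checksum must be recomputed after flipping it. Offsets follow the
# simplified SD spec: CSD bit 12 lands in byte 14, bit 4; byte 15 holds
# CRC7 in its upper 7 bits with the LSB fixed at 1.

def crc7(data: bytes) -> int:
    """Bitwise CRC-7 (poly x^7 + x^3 + 1) as used by SD/MMC commands and CSD."""
    crc = 0
    for byte in data:
        for i in range(7, -1, -1):
            d = (byte >> i) & 1          # message bits, MSB first
            msb = (crc >> 6) & 1
            crc = (crc << 1) & 0x7F
            if msb ^ d:                  # feedback taps: x^3 + 1
                crc ^= 0x09
    return crc

def set_tmp_write_protect(csd: bytes, enabled: bool) -> bytes:
    """Return a 16-byte CSD image with TMP_WRITE_PROTECT set or cleared."""
    csd = bytearray(csd)
    if enabled:
        csd[14] |= 1 << 4
    else:
        csd[14] &= ~(1 << 4)
    csd[15] = (crc7(csd[:15]) << 1) | 1  # refresh checksum over bytes 0..14
    return bytes(csd)
```

The CRC7 routine can be sanity-checked against the well-known CMD0 vector, whose checksum is 0x4A (transmitted as the byte 0x95).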


Filed under: security hacks
12 Nov 09:00

Venezuela’s Price Police

by Alvaro Vargas Llosa
Miguel A. Arévalo

Venezuela is doomed to famine or civil war...

Venezuelan President Nicolás Maduro

The Venezuelan government’s war on price inflation is not a metaphorical one—last weekend President Maduro, who owes his title to April’s rigged election, ordered the military takeover of Daka, a chain of electronic stores, and the arrest of several managers from that and other retail companies. The rhetoric employed by Maduro was inevitably interpreted by the masses as an encouragement to loot—which is what they did in the city of Valencia.

Besides the images sent out by citizen-reporters in Valencia, the picture that best captures the essence of what is happening is the one tweeted by a government minister who tried to justify the measures. The photo shows a washer/dryer that, in the words of the minister, “cost 39,000 VEF on November 1 and today costs 59,000 VEF, a nearly 100 percent rise in a week...”

Yes, minister, that is precisely what happens when you are on the verge of hyperinflation! The inflation rate is now nearing 60 percent and, as anyone who has lived under those conditions (including yours truly) knows, going from 60 percent to 1,000 percent is a lot easier than going from 3 percent to 40 or 50 percent.

For many years Venezuela has produced nothing but oil (in decreasing quantities) and has therefore had to import pretty much everything the population consumes. The artificial level at which the government has kept the bolivar in spite of the massive outflow of hard currency required to meet those import needs has, of course, caused a severe drop in foreign-exchange reserves. Government controls and the militarization of the economy have not prevented the black market from taking the real exchange rate to a level ten times higher than the official one. No wonder people who have to obtain US dollars to import even the most basic stuff sell it at much higher prices than the government Wonderland economy dictates.

But that is not all. Since 2002 the money supply (M1) has grown at an annual rate of 54 percent while real GDP per capita has risen at an annual rate of just under 4 percent. The socioeconomic model based on producing nothing and consuming everything at subsidized prices, printing colossal amounts of money to paper over the government’s yawning fiscal gap, and literally waging a war on private businesses has led to stagflation—minimal growth and skyrocketing prices that have turned ministers into photographers. A few weeks from the local elections that will be the first test of Maduro’s support since the rigged presidential election, the political consequences are already serious—he faces the possibility of such a big defeat that the numbers will simply overwhelm the government’s ability to commit yet another electoral fraud.
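Those two growth rates already imply roughly the observed inflation: under the quantity theory with stable money velocity, price growth is approximately money growth net of real growth. A back-of-the-envelope check, purely illustrative, using the article's figures:

```python
# Quantity-theory arithmetic with the article's figures (M1 growth ~54%/yr,
# real GDP per capita growth ~4%/yr). With stable velocity,
#   (1 + inflation) = (1 + money_growth) / (1 + real_growth)

m = 0.54   # annual M1 growth
g = 0.04   # annual real GDP per capita growth
implied_inflation = (1 + m) / (1 + g) - 1
print(f"{implied_inflation:.1%}")  # 48.1%
```

That back-of-the-envelope 48 percent sits right next to the near-60-percent rate the article reports.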

This explains Maduro’s desperate move over the weekend. The man who openly encouraged people to loot and ordered the military to take over several stores around the country in an attempt to lower consumer prices understands all too well that he is in danger. He lacks Hugo Chavez’s charisma, he is presiding over a regime that exhibits growing cracks, and he is fast losing support according to the few polls one can trust in that environment.

We don’t know if he will be able to rig the next election at this point. But we do know this: He cannot win the war on inflation (soon hyperinflation) no matter how many stores he takes over, how many managers he throws in jail, and how many businesses he allows the masses to loot. Sooner or later, price inflation will cause his demise.

12 Nov 08:45

Reverse Engineering the Z80’s 16-bit Increment/Decrement Circuit

by Adam Fabio
Miguel A. Arévalo

It's always fun to see that you can still go from logic gates to CPUs :-D

z80

Increment and decrement. They sound like simple functions. But even the simplest functions can get quite complex in a microprocessor design. Ken Shirriff has written up a great blog post about his reverse engineering of the Z80’s 16-bit increment/decrement circuit. The Zilog Z80 was one of the most popular microprocessors of the ’70s and ’80s. It was used in many classic computers such as the Osborne 1. These machines would often use the Z80 to run the popular CP/M operating system.

The increment/decrement circuit is responsible for updating the program counter register during normal (non branch) operations. The increment/decrement circuit also handles the stack pointer register during stack operations, as well as several other functions. One might wonder why a separate adder would be used when the microprocessor has a big ALU available to it. The answer is twofold. First the ALU is already in use handling user math operations. Secondly the increment/decrement circuit has to be fast. A generic ALU just won’t be fast enough.

One classic adding circuit is a Ripple Carry Adder. Ripple Carry Adders get the job done, but they are slow. Note that slow is measured in nanoseconds here – there are no clocks involved in the circuit. The whole thing becomes a classic combinational logic optimization problem. Each layer of logic adds a gate delay to the circuit. As the carry has to ripple through all 16 bits, there are 16 gate delays before the final result is available at the outputs. Delays like these are what limit the maximum clock speed for a given circuit.
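As a rough sketch, with bit-twiddling Python standing in for gates, a ripple-style incrementer is just a chain of half-adders: each bit XORs with the incoming carry, an AND produces the next carry, and the top bit cannot settle until the carry has traversed every stage.

```python
# Ripple-carry incrementer modeled gate by gate: sum_i = a_i XOR c_i,
# c_(i+1) = a_i AND c_i, with an initial carry-in of 1 doing the "+1".
# The serial carry chain is why the hardware version needs 16 gate delays.

def ripple_increment(value: int, width: int = 16) -> int:
    carry = 1                       # adding 1 == injecting a carry at bit 0
    result = 0
    for i in range(width):          # one half-adder per bit, LSB first
        a = (value >> i) & 1
        result |= (a ^ carry) << i  # XOR gate: sum bit
        carry = a & carry           # AND gate: carry into the next stage
    return result                   # final carry-out discarded, so it wraps

print(hex(ripple_increment(0x00FF)))  # 0x100
print(hex(ripple_increment(0xFFFF)))  # 0x0 (wraps, like a 16-bit register)
```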

The Z80 uses some tricks in its increment/decrement circuit. The first is carry-lookahead. A carry-lookahead circuit will calculate the carry values directly from the inputs. This reduces the gate delays significantly, but it requires more real estate on the die. A second trick is the carry-skip circuit. Carry-skip calculates the result for groups of bits rather than each bit individually. Again, it will reduce gate delays, at the cost of real estate. The actual Z80 implementation uses a mix of both circuits. Several other “helper” circuits are also used. Surprisingly, the Z80 has specific logic just to check for 1 (0x0001) on the internal address bus. This circuit is used during memory move loops to inform other parts of the chip that a loop is about to complete.
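The lookahead idea is easy to state for an incrementer, since the only addend is 1: the carry into bit i is simply the AND of all lower bits, so every carry can be formed directly from the inputs instead of waiting for a ripple. A sketch of that idea, not the Z80's actual gate arrangement:

```python
# Carry-lookahead incrementer: each carry is computed straight from the
# inputs. Carry into bit i is 1 exactly when bits 0..i-1 are all ones
# (an AND tree), so circuit depth grows like log2(width), not width.
# Grouping those ANDs a few bits at a time gives the carry-skip variant.

def lookahead_increment(value: int, width: int = 16) -> int:
    result = 0
    for i in range(width):
        a = (value >> i) & 1
        lower_mask = (1 << i) - 1
        c = 1 if (value & lower_mask) == lower_mask else 0  # AND of lower bits
        result |= (a ^ c) << i
    return result

# Same truth table as the ripple version, just a shallower circuit.
print(hex(lookahead_increment(0x0FFF)))  # 0x1000
```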


Filed under: classic hacks